First, an apologia for a technical complication ...
So, with previous books I've been in the habit of writing up my crib notes and blogging them around the time the book comes out in mass-market paperback and the ebook price is cut accordingly, which is traditionally twelve months after the hardcover publication date. However, we live in interesting times and the mass market distribution channel—used for small format paperbacks in the USA—is decaying (it died in the early 90s in the UK). Upshot: there have been no mass market paperback editions of my books in the USA since about 2015, although the UK market still gets small format trade paperbacks (which everyone thinks is a mass market release).
"Empire Games", the first book of the trilogy of that name, is published by Tor in the USA (and Canada) and by Tor in the UK (and EU, and Aus/NZ). Despite their similar-sounding names, these are actually two different companies within a sprawling multinational (Holtzbrinck Publishing Group), and although my US and UK editors work together, they're publishing through different distribution networks (because, historically, books weren't a valuable enough wholesale product to ship internationally). This is why the ebook price drop and small format paperback have already happened in the UK, but the ebook price cut and US trade paperback of "Empire Games" are delayed until December 5th.
(There are no current plans for a mass market release, despite which Amazon.com are optimistically saying that you can pre-order one for delivery on January 1st, 2099. And book 2, "Dark State", is due out on January 11th in the UK and January 9th in the USA.)
As it's the seasonal affective depression time of year and I always get slammed around the winter solstice, without further ado, here are some crib notes for "Empire Games". Spoilers ahoy!
"Empire Games" is a stand-alone-ish trilogy set in the same universe as my earlier "Merchant Princes" series. By stand-alone-ish I mean it should be possible to read it without familiarity with the previous books ... but a bunch of earlier characters re-appear, and it's probably best read as a continuation, 17 years after the end of the first story. (The working title of the project was, you will be shocked and horrified to learn, "Merchant Princes: The Next Generation".) If you want to go back and re-read the original series, I strongly recommend reading the collected omnibus editions instead. (Link goes to the US "buy my books" page; UK version here.) The original series was written circa 2002-2006; I was still learning how to do long-form narrative fiction properly, and I made a lot of minor improvements when I re-mastered the books in 2012.
In my original, 2002, pitch for a four-book series of big fat novels, there was a lump of text titled "The Family Trade"—eventually reassembled and re-released in its planned form as "The Bloodline Feud", for reasons I think I discussed here a few years ago—then three more books. The second book kind of expanded, and is now "The Traders War" and "The Revolution Trade". The entire "Empire Games" trilogy corresponds to book 3 of the original plan, but no plan ever survives contact with the enemy, let alone 15-16 years of wall-clock time and the march of history.
To recap: we live in a multiverse. There are parallel universes, mostly containing versions of Earth, and some people, equipped with a very high-tech bit of intracellular nanomachinery, have the ability to think themselves across to other distinctively different versions of Earth. Our first series revolved around Miriam Beckstein, an adopted 30-ish tech journalist from Boston who discovered she had an extended family the hard way and did what anyone in that situation does, namely tried to improve things and inadvertently triggered a revolution and a nuclear war in two time lines. Series crib sheet here.
"Empire Games" opens in the near future of 2020—please don't blame me? This book was originally planned for publication in 2014—with a rather different adopted child, Rita Douglas, being taken aside by Men in Black and made a job offer she isn't allowed to refuse. It transpires that in the Grim Meathook only slightly glow-in-the-dark Future-USA that emerged from the events at the climax of "The Revolution Trade", the US government has become totally paranoid about world-walkers, and has handed over responsibility for securing the nation against threats from all other parallel universes to the Department of Homeland Security, who, in a classic example of Mission Creep, have begun to invade and occupy other time lines, engage in large-scale resource extraction (and prototype carbon sequestration operations, by dumping waste CO2 in neighbouring uninhabited Earths), and turned the USA into the kind of total surveillance hell-hole that it turns out (pace Edward Snowden) it already was. (Did I say that writing plausible near-future SF is getting really hard these days?)
Rita, we discover, is Miriam Beckstein's daughter by her first husband, a non-world-walker. As such, she is a carrier of the inactive trait and of no interest to anyone, except that a certain shadowy national laboratory has now worked out how to switch on the world-walking ability in carriers. A couple of decades earlier the Clan was using a fertility clinic in Massachusetts to produce thousands of carriers, in an attempt to breed their way out of a demographic collapse (due to internecine feuding). DHS has got the Clan breeding program records (aka DRAGON'S TEETH), but Rita is a few years older, and unlike the youngsters, her background profile scores very highly for a particular job: paratime spy. She does, admittedly, come with a few drawbacks. For one thing there's her distinctly odd upbringing and her close relationship with her adoptive grandfather Kurt, who defected from East Germany in the late 1960s. For another ... just pay attention to Rita's ethnicity, gender identity, and politics. Her DHS handlers are all too pleased when she is reunited with an old flame late in her training/activation, because it means they've got a handle on her. But perhaps they aren't paying quite enough attention to the nature of the summer camp where Rita and Angie met, or the way their grandparents all grew up in East Germany. After all, who cares about what they may have gotten up to back in a Communist state that collapsed three decades ago?
We fairly rapidly discover that the reason Colonel Smith's unit within the DHS (yes, that Colonel Smith: the original Family Trade Organization was absorbed by DHS and more or less took it over, immediately after the events in "The Revolution Trade") want a human paratime spy is that in the course of their routine exploration of parallel time lines, most of which are uninhabited or occupied only by paleolithic tribes, they've discovered something excitingly different—a time line with electrified freight railroads, air defense radar, and nuclear-tipped surface-to-air missiles that can shoot down high-altitude drones. For nearly two decades they've suspected the Clan survivors are out there, hiding somewhere in paratime. If they're in this new time line, the approach they used on the Gruinmarkt at the end of "The Revolution Trade" isn't going to work, and they don't know enough about the target time line to even know what language they speak. Hence Rita's recruitment and activation.
Now let's step through the looking glass ...
Miriam Burgeson (she re-married in exile) is now a middle-aged high-ranking political apparatchik in the government of the New American Commonwealth, the political entity which supplanted the former New British Empire of Time Line Three in the original series, after the revolution that Miriam inadvertently bankrolled (during a fiscal crisis and national bankruptcy a couple of hundred kilograms of gold will go a very long way). In fact, she's one half of a Clinton-esque power couple; as Party Commissioner in charge of the Ministry for Intertemporal Technological Intelligence and head of the Department for Paratime Research (industrial espionage via world-walker) she has both a mission and the leverage to drag the Commonwealth kicking and screaming into the Century of the Fruitbat; and as Commissioner in charge of the State Ministry of Propaganda, her husband Erasmus has the media through which to disseminate her message: "The Americans are Coming." Miriam has dedicated her past two decades to ensuring that when the USA makes first contact with the Commonwealth, it won't be a one-sided struggle.
As William Gibson famously observed, "the future is already here: it's just unevenly distributed". On first visiting the Boston of the New British Empire, Miriam assumed it was an undeveloped, quasi-19th-century world: much steampunk, very zeppelin. But, just as a time traveller to our own 1942 wouldn't necessarily see any signs of the radar, computers, jet engines, ballistic missiles, or nuclear weapons that were all developing rapidly in secret laboratories, Miriam was unaware of the true state of tech in time line three. We meet her now, 17 years later, interrupting the official opening of an 8-bit microprocessor fab line to take a call informing her that the Commonwealth's first spy satellite has made orbit. Rita's 2020 USA is still five decades ahead of the Commonwealth, but the Commonwealth has computers and nukes and world-walkers. And the Commonwealth has something else, something the USA once probably had but has forgotten: a revolutionary democratic ideology and a deep state that believes it has a mission to spread democracy across the multiverse, starting with the USA.
What the Commonwealth means by democracy and what that word is commonly understood to mean in Rita's time line (and ours) are not the same thing at all. Time line three never had a French revolution, or an American war of independence, or a Russian revolution, or an Iranian Islamic revolution. When Miriam gave Adam Burroughs, the leader of the Revolutionary Party histories of all those revolutions and more back in 2003, he decided to learn from other revolutions' mistakes, and so far he's succeeded: the Commonwealth is rapidly outstripping the rival French Empire (capital: St Petersburg) which dominates Africa and Eurasia in this time line, and has its own paratime exploration program to rival the USA's. The future of Democracy looks bright. But the First Man has stage IV cancer, and as he lies in his sick bed it is becoming apparent that the Commonwealth is about to undergo its first change in executive power since the revolution. (The First Man's constitutional powers correspond more to those of the Grand Ayatollah in the Iranian system than to a US president: it's a position occupied for life, by the system's supreme constitutional judge and ceremonial head of state.) Various Party Commissioners are jockeying for position, and across the ocean, the King in Exile is looking to the turbulence as an opportunity for restoration of the monarchy (as happened with the collapse of the English Protectorate in 1660).
A paratime superpower with nuclear weapons and world-walker technology is about to undergo its first serious succession crisis, just as a US government paratime agency that knows none of this stumbles across them and decides to send in the spies. What could possibly go wrong? Read "Dark State" and "Invisible Sun" to find out, because the novel "Empire Games" is only the first third of the story ...
Now for some loose ends.
Mike Fleming from the previous series does not appear in this one. It seems likely that, in the wake of 2003's events, anyone trying a Snowden-style document dump would have been dealt with mercilessly. And Rita's USA is the sort of place where internet censorship goes unquestioned, there's a national ID card system backed by a comprehensive DNA database, there are CCTV cameras on every street corner, and people sometimes disappear in night and fog, vanishing from the social graph without anyone asking questions. (I wrote this pre-Trump, as an attempt at a Grim Meathook Future USA. I gather some folks now read it as optimistic, because at least my GMF-US government is run by competent people who have good reason to be extremely paranoid. Hah fucking hah, joke's on me.)
Iris, Miriam's mother, is dead. (She'd have been in her 70s at this point, and had advanced multiple sclerosis at the time of the first series.) Rita's adoption was Iris's idea, and should be viewed in light of the Clan's internal braid politics and Iris's feud with her own mother, the now-dead Dowager Hildegarde. (Arranged marriages to conserve the world-walking trait pit mothers against daughters on behalf of grand-daughters. By raising Miriam in the USA—where Miriam bore a non-world-walker child—the rebellious Iris has mortally sinned against Hildegarde's dynastic ambitions. However, Iris is wary enough to plan for future contingencies. If Miriam is ever reintroduced to the Clan, she can't be allowed to have a bastard in tow. Hence the arm-twisting-induced adoption.)
Iris's social circle in the 1970s in Boston is mostly countercultural, which is how she meets Kurt Douglas (Anglicized name, after emigration). Kurt's son and wife are first generation products of the Wolf Orchestra (about which, more in the crib notes for "Dark State", next year). Purely by coincidence, this means they have exactly the set of attributes Iris is looking for in foster parents: outwardly law-abiding, quiet, and risk-averse, but privately non-conformist. They'll teach the kid to keep her head down and move under cover, along with other more recondite skills, like fooling polygraph tests and conducting dead-letter drops. What could be wrong with that?
In the back story, the US population lost their collective shit in 2003 (understandably so: the events of 7/16 dwarfed those of 9/11) and didn't get it back together until 2012 or later. I haven't inked in the political time line between 2003 and 2020 in any detail, however there is no President Obama in Rita's world (and no President Trump either). After 2012 power alternates between the authoritarian/national security wing of the Democrats, and the Republicans (among whom the Christian dominionists are making gains, hence the mention of the "no choice" states that Rita refuses to set foot in without good reason). Dominionists aren't the only believers making gains: the Church of Scientology is sufficiently respectable that membership isn't an automatic strike-out for a security clearance, and they're represented in the Homeland Security paratime organization. As for the LDS and parallel universe versions of Earth, there has been a doctrinal amendment and consequently DHS is crawling with members of the Mormon Church hoping to find a set of golden tablets buried in another version of upstate New York. (Much like the CIA). (That DHS has also been penetrated by two other groups, and who they might be, is a spoiler for a later book. Let's just say, if you set up a sprawling rapidly expanding bureaucracy it will become a honeypot for entryism.)
The USA of Rita's world is much less engaged with the outside world, but has directed considerable energy into resource extraction from uninhabited time lines, and its energy economy is still largely carbon-based (hybrids and electric cars are still a niche, rather than taking off in a big way). It's also a world where the 2007/08 financial crash didn't happen. Instead, there was a smaller non-global crash immediately after 7/16 in 2003, then a larger crash after the India-Pakistan nuclear war (World War 2.5). Since then, the recovery has been more focussed on wealth creation, as vast new natural resource deposits have become available in empty parallel earths.
The Dome in the Forest, and the Gate to Nowhere: this gets a lot of coverage in "Dark State". You'll just have to wait to find out where it goes.
Oh, and in the first series? World-walkers were limited to more or less what they could carry by hand, until the late re-discovery of the ability to use an electrostatically insulated carrier, like a wheelchair or a cart, to move larger quantities between time-lines. In "Empire Games", the bar has been raised somewhat higher.
Final-final note, because I forgot to say it earlier: the first series was, thematically, about family and also about economic development traps. This new series is about how to get out of economic development traps, and about the political equivalent of an economic development trap (a suboptimal strange attractor that keeps sucking a polity backwards into autocracy), and whether our idea of democracy as an ideal form of government is actually valid (without veering off into neo-reactionary/monarchist/totalitarian territory). Put it another way: in our universe, nobody has ever tried to implement Rawls' theory of justice or realtime central planning a la Cybersyn in a revolutionary republic. Or tried to build Project ORION. But there's always a first time ...
(Yet another technical complication: some of the hrefs of your links to wikipedia (Snowden, Dragon's Teeth, theory of justice, project Orion) are missing the closing ")". A common Markdown problem; escaping the parentheses with a "\" seems to be the answer.)
Fixed, I think. Thanks.
s/defected from East Germany in the early 1980s/defected from East Germany in the late 1960s/;
Thanks for that. Whets my appetite for the rest of the trilogy, unfortunately there is a longer than desired timespan between appetizer and the main course, but I will manage to live through it. As I have been reading through them one at a time since a couple of years after the first one came out.
BTW, I have found that reading a Charlie Stross story is much improved on a Kindle. A challenging vocabulary is a lot of fun but is less of an interruption on the Kindle, as the built-in and internet-connected dictionary and Wikipedia make understanding new words easier than putting the book down and picking up a nearby computing device that the dogs are napping on.
Again, a hearty and well meaning thank you for the occasional blog updates and the ongoing expansion of my reading library.
Speaking of development traps, I have the impression that the USA and the Commonwealth are heading straight into Exploitation Colonialism. You extract resources from a place you don't care about and consume them. Research and development or financial innovation come second to just cranking out more supplies from your colonies/alternate timelines.
At least the bright upside is that lack of research into automating industrial processes and making them more efficient will bring back the high-employment era for low-skill labour. You're getting oil for free, steel for cheap and you can literally dump your ecological concerns outside of this reality. So, you build cars!
Side note re. seasonal affective disorder (SAD): Both my wife and my BFF have had lots of success with "SAD lights". There's some clinical evidence that this kind of phototherapy works*, so I suspect it's not just a placebo effect or anecdata in the two examples I know personally.
One thing that surprised me as not being listed in the risks: insomnia. I suspect that's not a common problem for most people, but since blue light can exacerbate any pre-existing problems with falling asleep at the end of the day, it's worth restricting the treatment to early in the day.
Both my wife and my BFF have had lots of success with "SAD lights"
"Hello, grandma, do you know how to suck eggs?"
(Sorry, but this is not, ahem, a new problem for me. Hint: what latitude do I live at? What are my sunrise/sunset times today?)
Gate to Nowhere
Speaking of that, its distal end is said to be located 8,000 km from a planetary-mass black hole that's believed to be the remains of a crushed Earth. Presumably the hole would be at the barycenter of the Earth and the GtN would have opened on the former surface, as its proximal end does on an extant surface.
But the radius of our Earth is a nominal 6,378 km (actually varies a bit from place to place), so maybe the GtN didn't open onto the surface, but to a place in space some 1,600 km up?
I'm pretty sure I double-checked the distance, so ... metric/imperial conversion error creeping in somewhere?
I thought as I was reading -Empire Games- that it was a shame that Adam Burroughs didn't take a page from the George Washington - John Adams playbook and set up a peaceful transition of power while he's alive.
I'd rather that the NAC not get the equivalent of Vladimir Putin in charge (I can't remember the name of the creep who is trying to do just that in-universe) - I'd rather that the NAC not become a People's Republic of Tyranny.
I shouldn't worry, though-- I'm sure that Charlie wouldn't give us a dystopian outcome.
I seem to remember an image of the sort of LED lamp that would probably need welding glasses to approach.
I was tempted to get one but didn't want to blind the neighbours.
metric/imperial conversion error creeping in somewhere?
Dunno. The diameter is ~8,000 statute miles, so maybe that???
https://en.wikipedia.org/wiki/Earth_radius
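For anyone who wants to check the arithmetic in the exchange above, here's a minimal sketch (assuming the nominal 6,378 km equatorial radius cited earlier). It computes the implied altitude if the "8,000 km" figure is taken literally, and then Earth's diameter in statute miles, which lands suspiciously close to 8,000 and supports the diameter-in-miles-mislabeled-as-km theory:

```python
# Sanity-check of the Gate to Nowhere distance discussed above.
EARTH_RADIUS_KM = 6378       # nominal equatorial radius, per the thread
CLAIMED_DISTANCE_KM = 8000   # stated distance of the gate's distal end from the black hole
KM_TO_MILES = 0.621371       # statute miles per kilometre

# If the gate opened 8,000 km from a hole at Earth's former barycenter,
# it would sit well above the former surface:
altitude_km = CLAIMED_DISTANCE_KM - EARTH_RADIUS_KM
print(f"Implied altitude above former surface: {altitude_km} km")  # 1622 km

# Earth's diameter in statute miles is close to 8,000, which is one
# plausible source of a metric/imperial mix-up:
earth_diameter_miles = 2 * EARTH_RADIUS_KM * KM_TO_MILES
print(f"Earth diameter: {earth_diameter_miles:.0f} statute miles")  # ~7926 miles
```

So the "8,000" may simply be Earth's diameter in miles that got relabelled as a distance in kilometres somewhere along the way, which is consistent with the conversion-error guess above.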
Charlie,
One huge question: is there any way to get the damn thing as an .epub? If I buy from Barndroppings&IgNoble, it goes on my ereader... and there's no way to back it up. They've gotten it hidden on the reader in some manner I haven't managed to identify yet. I can mount it in Linux... but there's no there there. I assume an alternate SD or whatever card, not accessible when it's not running.....
Charlie noted: "Sorry, but this is not, ahem, a new problem for me. Hint: what latitude do I live at? What are my sunrise/sunset times today?"
I really do get that most people who suffer from a condition have researched it and know the issues better than helpful folk like me, and get irate when we try to helpfully mansplain.
I also know my friend who is/was a SAD sufferer had suffered for many years because when she did her initial research, phototherapy was considered a fringe treatment with little scientific support. So I figured I'd rather take the risk you already knew about it than not mention something useful that you hadn't considered or had considered and discarded.
I am sitting right underneath one of these.
One huge question: is there any way to get the damn thing as an .epub?
Yes.
It's sold via the Google Play Store as an epub ebook, and per Tor policy, should be available without DRM. (If you buy it and find DRM, let me know and I'll bite some ankles.)
Or you can buy a Kindle version, import into Calibre with the usual DRM-stripper tool, and transcode to epub.
That's the one.
If I ask if they are dimmable will I get laughed at? :)
Nope, they're not dimmable.
Didn't think so.
My house is afflicted with dimmer switches left there by the previous denizens. Have dimmable LEDs in most of the fittings now, but thinking it is time to just change the switches as they are always on max anyway.
Can we drop the discussion of SAD (at least until we're a couple of hundred comments further in)? This is a topic for questions and spoilers on "Empire Games", after all.
USA? Maybe. The Commonwealth has to catch up with the parallel USA .....
65W LED? HOLY COW (I note the AMZN piece doesn't apparently give the actual light output?) I'm beginning to realise that the "normal" shops are useless in my quest to replace all my filament bulbs with LEDs, but I only want approx 1,200-1,500 lumens (the equivalent, approx, of an old-fashioned 100-120W filament bulb).
I note yours are "Edison screw" fitment, rather than the usual UK bayonet, as well. I'll have a burrow, because it's time I dumped all the filaments. (I have discharge strip lights in the kitchen, though.)
so how do they treat SAD in the Commonwealth? Do they have decent LED lighting yet? (SCNR. dr&h)
Just pulled a batch of cornbread from the oven, and while it cools I see a new Crib Sheet. Hurrah!
BTW, Charlie, do not worry about the story being set in the so-called near future of 2020. "Empire Games" is not our world. I'm concerned that you feel that you have to explain that. It's a sign of how Fandom has become even more bizarrely fragile about their world view than in the past. I've noticed that Fandom has become more and more shrill when their individual world views collide with novels. I see it crippling the field as people like Robinson put out ludicrous books like New York 2140. More and more authors have dropped off my "to buy" list as they jump the shark like that.
The Greeks had the best view of time. They saw Now as a chariot moving into the Future, with us facing backwards, only seeing the Past clearly. If you turned your head to either side, you could get glimpses of the Future, but only glimpses. The act of looking right or left would produce different Futures each time you turned your head to look.
Empire Games is solid, and does not require reading the earlier trilogy to be trapped by the story.
Well done.
can't remember the name of the creep who is trying to do just that
that would be the party's security director, "Keith" something if I remember right.
It's been a while since I read the book, so I may get some of the details wrong/have forgotten them.
If I remember correctly, the commonwealth controls basically North + South America while the French Empire controls Eurasia? My question is: how is industrialization spread in both empires? Is it contained in 1 area for each empire (Europe for the French and N. America for the Commonwealth)? The reason I'm asking is because in our world, Total Fertility Rate and thus migration are tied to industrialization.
Tied to that, how widespread is the AK-47 equivalent? If it's not widespread, why? This gives me an idea of how the empires are held together. In our history, the AK-47 functioned as "the great equalizer" by making it expensive in terms of both money and soldiers to hold the European Empires together. There's a reason European countries lost most of the decolonization wars after WWII. The US and the Soviet Union maintained control by supporting puppet dictators, is that how both of those Empires maintain control?
Here's my final question: how did the Commonwealth not fragment during its independence/revolution? Or did it fragment and was reconquered?
Mexico likewise fragmented (see the purple and red areas) https://en.wikipedia.org/wiki/Mexico#/media/File:Mexico%27s_Territorial_Evolution.png
The US had several close calls during the Articles of Confederation and the Whiskey Rebellion https://en.wikipedia.org/wiki/Whiskey_Rebellion
If I remember correctly, the commonwealth controls basically North + South America while the French Empire controls Eurasia? My question is: how is industrialization spread in both empires?
Demographic transition is a plot point in the new trilogy. And there's an essay on the history of time line three in the back of "Dark State". Main point is, they industrialized late, Malthusian trap due to poor agricultural distribution persisted later, Commonwealth is now undergoing demographic transition and desperately investing in the next-gen human capital they know they're going to need. Meanwhile, the French empire is a continuation of the Ancien Regime, and is falling ever further behind ... and beginning to wise up to the fact (ahem: see book 2).
AK-47 equivalent: less than relevant. What's relevant is ammunition factories and distribution networks. I disagree about the AK-47; I think the true break point was the combination of conscript mobilization by railway networks (see Tuchman, "The Guns of August") and the bolt-action rifle. Right now, time line three is in a strategic nuclear stalemate ... but the French Empire is stuck at piston-engined bombers and A-bombs, gawping bemused at the spysats and SSBNs overhead and under the oceans. They've been massively outstripped in fifteen years flat by a rival who, two decades earlier, was on the ropes.
Imperial fragmentation: one point about slower development is that the mechanisms of industrial age imperial dominion have had time to bed in further, and communication systems are better than in our time line. E.g. the New British Empire has its revolution at the same time as 125mph high speed trains and a reasonable telephone network, as compared to the Russian revolution (few telephones, trains limited to 60mph or less). An isochrone diagram would thus show shorter communication lag across the empire, and reconquest would be faster. (Also: the French failed to recognize the significance of the revolution across the water when it happened, due to the lack of any historical precedent known to them. The Radicals, however, knew about other time lines.)
I would submit that they're already there and have been since their inception. The structure of the NAC is very autocratic and highly tyrannical; any specific Big Man at the top might chose not to exercise the power available to him, but that power exists and checks on it are few. It can enter a failure state that's near-impossible to recover from with one sub-optimal succession, which is of course a major plot point.
Charlie,
I had a long conversation with a friend, explaining the series. He had some interesting opinions:
1) The Commonwealth locals understand the absolute need to secure nuclear weapons (in part, because the world-walkers taught them). The world-walkers will be able to steal some, but they won't have massive retaliation ('merely' a few cities).
2) The US can play an extremely strong military and diplomatic card, by putting embassies into the capitals of that world (with a few massed B-52 flyovers for morale).
3) One advantage of being in the USA's timeline is that they are less likely to use nukes. Hiding close by is always an option. And it's really hard for the USA to place panopticon surveillance on most of the territory of the USA, let alone the entire world.
4) The world-walkers can play the diplomatic game right back at the USA - every country in the world desperately wants world-walking ability.
Well, they got rid of the Tsar, they narrowly avoided having a Reign of Terror more by luck than judgement, now Lenin is about to pop his clogs, apparently without having taken effective measures to protect against Stalin coming next...
Which is rather an extreme level of blindness for someone who has been able to foresee his own coming death for rather longer than Lenin, who unlike Lenin has been mentally unclouded throughout his illness, and who has supposedly studied that period of Earth history in order to avoid making the same mistakes.
Which in turn leads me to wonder if there exists some situation maybe such as the alien originators of the mind-controlled molecular machinery still being around, and having now progressed to the level of being able to implement the machinery-controlled mind...
It strikes me also that there is a possible parallel between the Clan refugees and the Jewish diaspora, which in turn opens the possibility of the making of a particularly undesirable type of political hay out of their presence.
Diasporas happen all the time. What's interesting is how the dispersed integrate within a new country.
Felix Dzerzhinsky who re-founded the "new" communist secret-police, regarded Stalin as "too harsh" (!)
Just how severe is the political repression in Empire Games-United States? Aren't you even allowed to point out in public that the Bill of Rights looks like a dead letter without the Stasi coming down on you?
So the Commonwealth had four revolutions to draw lessons from. French revolution: idealists killed or exiled, replaced by dictator. Soviet revolution: idealists killed or exiled, replaced by dictator. Islamic Iranian revolution: idealists killed or exiled, replaced by hardline religious fanatics. American revolution: did NOT become a dictatorship.
Really can't understand why the Commonwealth revolutionaries didn't just say "What those American guys did? Let's do that."
Maybe Miriam had absorbed too much of the "USA is the source of all the world's problems" ideology from time line one?
Whether or not they're called 'mass market', I still like small format paperbacks if I'm not buying the hardcover of a book - they fit my bookshelves better. Nice to see this out.
SLAVES is the answer to that one, I think?
While the NAC seems to be a one party state with an elected head of state for life (and possibly no local elections...? Can't remember but it would surprise me if they did not) they are more economically democratic. IIRC the microprocessor fab line that is opened is referred to as a cooperative.
So even if their political system follows less democratic forms than we're used to (and let's be honest having them hasn't stopped a global slide towards plutarchy) their economic system would seem to empower everyday workers far more than ours.
Really can't understand why the Commonwealth revolutionaries didn't just say "What those American guys did? Let's do that."
What, have an extremely fragmented nation with such a toothless federal authority that the question of dictatorship does not arise?
Given that the Commonwealth need a strong central authority to drive through their industrialization policy that's not exactly an option.
But revolutions in the real world don't repeatedly go wrong in the same ways because the people running them are stupid or ignorant of history. They see the long-term issues, but are too busy staving off the next crisis, coping with the urgent problems, etc. Having established that power comes from the barrel of a gun, putting the djinn back in the bottle and settling down to a nice stable rule of law is really hard.
What, have an extremely fragmented nation with such a toothless federal authority that the question of dictatorship does not arise?
Exactly. Seems a good tradeoff to me.
Given that the Commonwealth need a strong central authority to drive through their industrialization policy that's not exactly an option.
No they don't. The USA was the world's leading industrial economy by no later than 1900 in this timeline, surpassing Great Britain despite the British head start on industrialization. And that's not just modern historians with hindsight (eg Paul Kennedy) but was recognised at the time (eg Mackinder).
And if you're going to say that the Commonwealth is a special case because it's preparing for war: again, no, not necessary. The USA outproduced everybody else in WW2, and the dictatorships were worse at just about everything from incorporating women into the workforce to designing for mass production to choosing R&D goals.
Post WW2, the Soviet Union couldn't catch up to the USA and Western nations. China has only become a technological powerhouse by joining the free market.
If you want your society to progress - and not just technically, but also socially - a "strong central authority" is the last thing you want.
The problem with no strong central authority is that you get a situation like Somalia, known worldwide for its scientific and cultural progress. The wrong kind of strong central authority gets you places like Venezuela, where the government seems to be marching the country off the cliff.
I think that rule of law is the key metric: "equality before the law, liberty within the law, nobody above the law." as Charlie put it in EG.
In addition, there is a big difference between developing tech for the first time and knowing where you are going.
A huge number of the inventions of the 19th and early 20th century were dead ends, huge amounts of effort and genius wasted for every good idea that changed the world.
These guys already know what works and the necessary dependencies to get there. What they don't have is enough high tech tooling and skilled workers.
They might be in trouble when they catch up with the USA and find they have forgotten how to do basic research but that's a problem for the future.
The world-walkers will be able to steal some, but they won't have massive retaliation ('merely' a few cities).
Wrong. Nuclear weapons are 1940s tech that relies on 1930s physics; the Empire already had an A-bomb program, as did the French, in the original series. By "Empire Games" both sides are nuked-up to late 1960s USSR/USA cold war levels. And the Commonwealth can bushwhack Rita's USA quite easily: they know where all the cities are, and cities can't dodge.
All you need is a (1949-vintage) B-36 Peacemaker bomber-equivalent, with two extra seats for worldwalkers. Fly across your own friendly skies to within about 3 miles of your target, open the bomb bay doors, begin the bomb run, then switch universes. Worldwalker #2 takes you back over after the bomb drops but before it goes off, and you set course for the next target, an hour or two away as the giant piston-powered/turbojet-assisted/or maybe turboprop bomber flies.
(Surface to air missiles are not an air defense magic wand; unless the USA has rolled out really fast interceptor missiles like the Sprint ABM and put them on a hair-trigger alert, they're not going to be able to block a 1950s-vintage subsonic bomber attacking from a parallel universe.)
So at the point where the ~USA encounters the Commonwealth, it's an instant Mexican stand-off with the initial advantage to the Commonwealth (the ~USA doesn't know where all their cities are).
As for the diplomatic game, that's what the next two books in the series are all about.
Just how severe is the political repression in Empire Games-United States?
It's utterly unlike anything in any post-revolutionary republic in our timeline, because they're in terra incognita, making their own (new) mistakes.
In their case: it's the first revolution against monarchism ever to succeed. The ideas of the Enlightenment are not widespread, so the Vanguard Party isn't promoting communism — it's promoting democracy. They have picked a best-of-breed constitution from our time line (hint: revolution that took place in 1979, constitutional framework that has survived more or less intact despite attacks from the dominant superpower and the equivalent of a world war one experience) and deliberately designed a Deep State that keeps parasites from chewing on the fragile shoots of democracy as they take root in the polis.
Upshot: if you're a royalist or opposed to the ideas of the revolution (liberte, egalite, fraternite), they'll come down on you hard — especially as the royalists are bomb-throwers in league with a hostile foreign power. If you buy into the idea of human rights, a universal franchise, and so on, you're inside the magic circle.
(In other words they're like Fox News's worst nightmare — trigger-happy liberals with guns who refuse to take any shit from totalitarians.)
Really can't understand why the Commonwealth revolutionaries didn't just say "What those American guys did? Let's do that."
Because the American revolution didn't happen in a vacuum — it required a pre-existing radicalized population of independent-minded settlers who'd been marinating in Enlightenment philosophy for a couple of generations, in uneasy coalition with a wealthy slaveowning elite.
The political preconditions for duplicating the American revolution simply weren't there. Also, your constitution sucks. (Supporting evidence, item number 1: President Trump was even possible. Supporting evidence number 2: the Civil War happened. Supporting evidence number 3: the Equal Rights Amendment never passed. I could go on ...)
I think the true break point was the combination of conscript mobilization by railway networks ... and the bolt-action rifle.
I thought this came a little earlier than WWI. These elements were basically there for Moltke in 1866, enabling Prussian dominance at Königgrätz and elsewhere.
So even if their political system follows less democratic forms than we're used to (and let's be honest having them hasn't stopped a global slide towards plutarchy) their economic system would seem to empower everyday workers far more than ours.
Yep. Also, they have an elected assembly (the magistracy): it's just that in order to run for the legislature you have to pass muster in front of the Radical Party, i.e. the deep state—main criteria being that you are not trying to dismantle democracy from the inside, by promoting monarchism or totalitarianism, and that you're not trying to exploit the workers (their attitude to "corporations are people" would probably involve rolling on the floor, laughing).
If you think this sort of arrangement is impossible, think again—there's a country out there that runs on this very basis today. (Only you need to do a global search/replace between "shi'ite islam" and "democracy".)
I think that rule of law is the key metric: "equality before the law, liberty within the law, nobody above the law." as Charlie put it in EG.
Yep. And while the USA today pays lip-service to that rubric, in practice people with sufficient money and influence are above the law. Witness Trump's boast during the election that he could walk down Fifth Avenue and shoot someone with impunity. Okay, so he was boasting, but the principle remains, and so do the roughly 90% of prisoners in the USA who copped to plea bargains, mostly because they couldn't afford to defend themselves effectively against bullshit charges.
They might be in trouble when they catch up with the USA and find they have forgotten how to do basic research but that's a problem for the future.
This is covered: MITI enforces an "eat your own dogfood" rule, along the lines of the way Kurchatov permitted use of atomic weapons intelligence in the 1940s — he got to read it and veto unproductive lines of research, but the Soviet A-bomb was otherwise home-grown because he wanted to have a viable native infrastructure (in case the A-bomb spies got put away). The exception is where foreign tech is needed to achieve a specific national security goal involving paratime operations.
At some point I really need to write the essay I have planned on how the Commonwealth is gearing up to produce its own computer and software industry and internet, avoiding the security pitfalls and local minima we've stumbled into, while shaving 20% off the time taken to achieve it. But that's another blog entry, probably for after "Dark State" because it would be a bit spoileriffic at this point.
Interesting .... Some of us attempted to steer in a direction that avoided the disasters we could foresee (and which came to pass), but the engineers and mathematicians lost out to the marketeers and demagogues. I shall be interested to see which aspects you have picked up.
The fact that the USA took over from Great Britain as the premier superpower about then was common knowledge before 1930 - see the end of 1066 And All That.
SLAVES is the answer to that one, I think?
Slavery was a worldwide institution at the time of the American revolution, and of the French revolution. The American revolutionaries didn't introduce slavery to the Americas, and they were faster than anyone except Great Britain in abolishing it.
In addition, there is a big difference between developing tech for the first time and knowing where you are going.
Agreed, but how do you distinguish between "already know what works" and what might have worked differently?
For one example from J.E. Gordon, in our timeline the pneumatic tire, which makes road transport vastly more efficient, didn't get invented until after the railways were established. A tire is a really simple piece of tech, probably simpler than a high pressure steam boiler or high tensile railway trunnions, but relies on a rubber treatment that was only discovered by accident.
So are railways really the best way to start your industrial buildup? Or should you go for road transport?
Experimentation beats relying on authority, even if you make more mistakes.
Another example, Miriam is a tech reporter from the early 2000s, when electric cars were largely unheard of and/or a joke (Sinclair C5 anyone?) The idea for electric cars has been around almost as long as internal combustion engined cars. But could we have gone straight to electric cars in 1900?
they were faster than anyone except Great Britain in abolishing it.
And they only required 600,000 deaths to abolish slavery!
Unfortunately, when Reconstruction ended in 1877, slavery was back, albeit under a different name.
They have picked a best-of-breed constitution from our time line (hint: revolution that took place in 1979, constitutional framework that has survived more or less intact despite attacks from the dominant superpower and the equivalent of a world war one experience)
The Iranian constitution? The one that allows:
Imprisoning human rights protesters: https://www.amnesty.org.au/iran-vilifies-human-rights-defenders-enemies-state/
Imprisoning trade unionists: https://www.amnesty.org/en/documents/mde13/6147/2017/en/
Institutionalized discrimination against women and religious minorities in the justice system: https://www.amnesty.org/download/Documents/MDE1327082016ENGLISH.PDF
Execution of homosexuals for being homosexuals, not to mention publicly hanging them from construction cranes so everyone gets a better view: https://www.amnesty.org/download/Documents/MDE1327082016ENGLISH.PDF
Interesting choice.
...Looks like I derped out when phrasing my question. With the United States, I meant the one where Rita grew up.
So it's a bit of a mix of China and Iran. Iran for the life-long supreme leader (what's the NAC's version of the Assembly of Experts?) and the Radical Party acting as the Council of Guardians.
The China vibe comes not just because it seems to be a one party state but because like the CPC the NAC is ruled by a party that predates the state. In fact it seems to have built the state apparatus around itself.
Will we be getting a more in-depth look into the political and economic structure of the NAC? I realise it's not the story but would be interesting to read about a revolutionary socialist super power that had a playbook of what things to avoid.
Another example, Miriam is a tech reporter from the early 2000s, when electric cars were largely unheard of and/or a joke (Sinclair C5 anyone?) The idea for electric cars has been around almost as long as internal combustion engined cars. But could we have gone straight to electric cars in 1900?
What you're getting at is path dependency — the tendency to continue down an established path because of the knowledge base associated with it, rather than trying something new.
It's not terribly clear in "Empire Games" but the VTOL rotorcraft Miriam and Erasmus are picked up by during the nuclear alert triggered by the third US recon drone to appear in Commonwealth skies is not a helicopter — it's a Rotodyne, a type of compound gyroplane that showed a lot of promise in the 1950s but ended up as a Road Not Taken development (because the British government circa 1960 had rocks in its collective head and thought a horribly loud aircraft with roughly the performance of a V-22 Osprey, only half a century earlier, was suitable as a city-to-city commuter plane but had no military utility).
Path dependency (and pork-barrel politics) locks us into sub-optimal solutions like the aforementioned Osprey — five gearboxes flying in loose formation — because reducing the noise level of the Rotodyne — an "unproven" technology — requires veering off the beaten track, and existing market incumbents don't want the competition. But I digress: the Commonwealth had no pre-existing helicopter industry when the Revolution happened, so had to start from scratch, and had access to the history of all the also-rans from our time line, and threw some money at relatively-low-tech hopefuls.
Electric cars ... I'm not sure, but I think the cost lies mostly in developing high capacity/fast charge Lithium batteries that don't deflagrate under the driver's seat. We can thank tens of millions of laptops and thousands of millions of smart phones for us getting that technology right. Similarly: modern OLED and LCD displays are one of the most widely overlooked insane-high-tech products out there, and took multiple decades to develop to current tech levels. CRTs, however, can be built starting with 1920s technology.
You seem to be under the impression that the constitution of Iran explicitly enforced these conditions, rather than them being due to much more complicated factors. The US constitution existed long before women's/minority rights or the decriminalisation of homosexuality, and AFAIK it was still around when the US was detaining people indefinitely and torturing people under the guise of "enhanced interrogation".
I'm not saying the NAC is some perfect utopia of human rights, but so far from what we've seen of it, it's pushing equality, democracy and strong workers' rights through cooperatives and unions. Forced to make a choice of emigrating to the NAC or the RL United States, I'm not sure I'd choose the latter.
They have picked a best-of-breed constitution from our time line
There might be a dispute about that... How about Britain from 1688 onwards? Constitutional monarchy, enormous strains, especially 1792-1803, but it pulled through & gradually enlarged the franchise & became more & more truly democratic, & with a much lower (proportional) body-count than any of the others. Of course, that is partly because the big body-count was 1642-51, & up to 1669 in Ireland.
* Rolls eyes *
Yeah, so an islamic state is a crapsack place to live if you're anyone but a male patriarchal theocrat: that's not the point.
The point is, they found a way to build a republic with democratic forms of lawmaking that was resilient enough to survive an eight year long war with a fascist dictatorship that killed about 2% of the male population, while facing out-of-area threats by two hegemonic superpowers (later one) and an ideologically hostile regional superpower (Saudi Arabia). They're still having elections, on their own terms.
You (and I) may not approve of the goals of the Iranian government, but you've got to admit they're persistent in the face of overwhelming opposition,
Will we be getting a more in-depth look into the political and economic structure of the NAC?
Yes, including a very drastic failure mode (in "Invisible Sun"). Clue: fledgling democracies with autocratic antecedents are prone to coups, although the outcome is highly sensitive to contingency.
and they were faster than anyone except Great Britain in abolishing it.
Utter total cobblers. Try reading this FIRST. Talk about vain US self-important preening!
How about Britain from 1688 onwards?
Tell that to the Irish in 1845-52. Or the Scots in 1715/45. Or anyone who was prosecuted under the Bloody Code (which makes present-day Iran look liberal and enlightened).
... okay, there are a lot of reasons our Constitution sucks, Exhibits A and B being "the Electoral College" and "the Senate" but none of these three rise to that level.
All three of your examples are failures of the populace, not the system. The system is only as good as the people working it, especially in a democratic society where the system gains its legitimacy by supposedly representing and implementing the views and standpoints of those it governs and providing a peaceful method of transferring that legitimacy to another political coalition when the current ruling one loses the confidence of said people.
Someone like Trump being possible and the ERA failing to pass are the result of the country being filled up with enough evil fuckers to wield political power. That's basically not a surmountable problem in any system that wants to be democratically legitimate, as opposed to just pretending to be democratically legitimate. The Civil War, of course, was an extra-political act and could easily have occurred under any other plausibly alternative system. Indeed, a large proximate cause of it was that the south realized the Constitution did nothing to protect their favored ideology from being dismantled, that is, it was far too anti-slavery for their taste.
See also Monmouth's Rebellion (failed) compared to William-&-Mary's, erm, "hostile takeover", which succeeded only 2 years later .....
Charlie, I said 'world-walkers'. The Commonwealth + the world-walkers can put a nuke on every US city and major base quite easily - with trucks, bombers not needed.
However, the Commonwealth military would be well aware of the fact that the USA could and would strike back with a couple of gigatons.
The question is leverage by the USA against governments. For example, the USA could give a serious hand up to the French Empire.
In my friend's opinion, the major goal of the USA would be to peel the government apart from the world-walkers. Miriam admitting that there was a succession crisis was ill-advised, IMHO.
Well, IMO AIUI the issue with the Electoral College is that some states pro-rata their EC votes according to the percentage each candidate polls; others say that candidate1 won in our state and cast all their EC votes for candidate1.
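For what it's worth, in the real system only Maine and Nebraska split their electoral votes, and they do it by congressional district rather than strict pro-rata. Still, the allocation rule is the whole game, and a toy Python sketch (all state names and vote counts invented) shows how differently the same popular vote comes out under winner-take-all versus proportional allocation:

```python
# Toy comparison of Electoral College allocation rules.
# All numbers are invented; votes are in thousands.

def winner_take_all(results, ev):
    """All of a state's electoral votes go to its popular-vote winner."""
    winner = max(results, key=results.get)
    return {winner: ev}

def proportional(results, ev):
    """Split electoral votes by vote share, largest-remainder rounding."""
    total = sum(results.values())
    quotas = {c: ev * v / total for c, v in results.items()}
    alloc = {c: int(q) for c, q in quotas.items()}
    leftover = ev - sum(alloc.values())
    # Hand remaining votes to the candidates with the largest fractional parts.
    for c in sorted(quotas, key=lambda c: quotas[c] - alloc[c], reverse=True)[:leftover]:
        alloc[c] += 1
    return alloc

state = {"A": 510, "B": 490}       # a 51/49 split in a 10-EV state
print(winner_take_all(state, 10))  # {'A': 10}
print(proportional(state, 10))     # {'A': 5, 'B': 5}
```

A 51/49 state swings ten electoral votes under winner-take-all but only one seat's worth of difference (here, none at all after rounding) under proportional allocation, which is why narrow wins in a handful of states can decide the whole election.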
Charlie, I said 'world-walkers'. The Commonwealth + the world-walkers can put a nuke on every US city and major base quite easily - with trucks, bombers not needed.
However, the Commonwealth military would be well aware of the fact that the USA could and would strike back with a couple of gigatons.
In the strategy of world-walking nuclear war the best deterrent is to display that you have an almost-impossible-to-stop second strike capability. That means you need to have some sort of facility that can launch nuclear weapons through timelines and be safely doppelgangered in each of them. Sure, you could strap an ARMBAND to an ICBM, launch it at your own city and translate in time to hit theirs, but the silo itself is at risk unless you doppelganger it. I've always thought one issue with the DG strategy is that it's infinitely recursive: each timeline's protective fortress is vulnerable to attack from timelines without a protective fortress.
Which is where a world-walking Strategic Defence Initiative could come in. It would be very hard to strike at a space station/satellite from another timeline. Even if intelligence had given you its orbit jumping something in at the right time to intercept would be very, very difficult. Even more so if the target was periodically shifting inclination slightly.
If the NAC demonstrate a strong second strike capability then the US likely will be paused on that front and go down the sponsored coup route.
This seems like a bit of a two-step, Charlie; Hugh Fisher brings up outcomes, and you bring up resilience, Greg Tingey brings up resilience, and you bring up outcomes.
Congratulations, you just second-guessed where a chunk of the plot goes in book 3.
Some of us attempted to steer in a direction that avoided the disasters we could foresee (and which came to pass), but the engineers and mathematicians lost out to the marketeers and demagogues.
If you take the Internet as an example, most of the bad directions taken have to do with the nerds implementing as if no one "bad" would ever want to try to use their results for "evil".
Marketers were just a layer on top of the hopelessly idealistic mess, err wonderful new thing, that resulted.
The biggest problem is that gerrymandering has been allowed to get out of control. States are so gerrymandered that it is basically mathematically impossible for Democrats to get elected. I think this has allowed the fringe element to take control because the people getting elected don't have to be moderates to appeal to moderate/liberals/conservatives, they just need to appeal to "enough" bat-s**t crazies and not upset too many hard-core, I'll-always-vote-republican, voters to get elected.
The best thing the remaining sane people in the republican party could do is re-balance the voting districts, that way the crazies would be in the minority again.
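The arithmetic behind "packing and cracking" is simple enough to sketch. In this toy Python example (all numbers invented), the same 60/40 statewide electorate yields either a 5-0 sweep or a 2-3 loss for the majority party, depending purely on where the district lines fall:

```python
# Toy illustration of packing and cracking: one 60/40 statewide
# electorate mapped into five districts two different ways.
# Each district has 100 voters; all numbers are invented.

def seats(districts):
    """Count districts won by party X; each district is (x_votes, y_votes)."""
    return sum(1 for x, y in districts if x > y)

# Map 1: voters spread evenly -- party X (60% statewide) wins every seat.
even_map = [(60, 40)] * 5

# Map 2: party Y's mapmakers "pack" X voters into two lopsided districts
# and "crack" the rest across Y-leaning ones, taking 3 of 5 seats with
# only 40% of the statewide vote.
gerrymandered = [(95, 5), (95, 5), (40, 60), (35, 65), (35, 65)]

# Both maps contain the identical statewide electorate (300 X voters).
assert sum(x for x, _ in even_map) == sum(x for x, _ in gerrymandered) == 300

print(seats(even_map))       # 5 -- X sweeps
print(seats(gerrymandered))  # 2 -- X holds only the packed districts
```

The same trick scales up: modern redistricting software just searches for the map that wastes the most of the other side's votes while staying inside the legal constraints.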
"It's not terribly clear in "Empire Games" but the VTOL rotorcraft..."
Oh, I thought it was crystal clear as soon as it made its appearance :)
"Electric cars ... I'm not sure, but I think the cost lies mostly in developing high capacity/fast charge Lithium batteries that don't deflagrate under the driver's seat."
Yes. That's the only bit we haven't had for more than a hundred years. (Can delete "lithium", but the rest is true.) Hence the low-key but long-term success of the milk float and the electric forklift, where the problems of existing batteries don't really matter.
It isn't really a matter of path dependency in that case. When cars were just coming in we tried everything we knew - IC engines, electricity, and steam. It was not clear that IC engines were going to win out until we'd done quite a bit of trying - steam cars got to 120mph when IC cars were still trundling; electric cars were much more reliable than the rather crappy IC engines of the time, and range was less of a concern when people had not acquired the habit of using it.
I'll be blunt: I strongly doubt that Our Gracious Host was originally thinking of the Empire as being that advanced but with uneven distribution: I think it's a retcon, and I'm 'fine with it', given the amount of time between original conception and the book, and given the plausible Gibsonian explanation.
I'm afraid that the intersection of Iris/Miriam with just the right Wolf Orchestra couple seems more fundamentally hinky to me, given the low frequencies of Wolf Orchestra membership—even in Cambridge, Ma. activist circles, sorry, U.S. right-wing loonies. I forget, was Iris in or a close student of Hjalmar[?]'s intelligence service? That might increase the odds that she'd know what to look for. Still, I think it a stretch that I accept only for the sake of a good story, which so far it is. ...Also, being blunter than might be Xenia-appropriate: I've always felt the series to be an attempt to write the deepest, most intelligent airport techno-thriller possible—successful, but with some artifacts of that genre, e.g. data-dumps of hardware specs, nigh-inevitably present.
You DID notice the caveats I attached to that statement? And, still a lot less bloody than what happened to the French, & all of the rest of Europe, 1792-1801...
I hate to say it, but the problem isn't the Constitution, it's the people. And maybe the fact that our news agencies print propaganda as fact.
Hmm... "Big Man for life", so, "elected king/queen"? Sounds like some of what happened under feudalism, or, for that matter, in a Certain Old Republic, yes, Princess Leia?
And its still here, as the "prison-industrial" system.
Are you saying that voting districts are different in state & national US elections? Otherwise there would be no "Dems" elected anywhere, which is patently not the case. Or have I missed something?
[ Agreed that the US has a very bad gerrymandering problem, which no-one seems to be addressing - though I would have thought that voter suppression, esp in "the South", might also have something to do with it? ]
The strong federal government can go either way - don't let ideology blind you.
Some of the early part of the USSR saw amazing experiments, crushed, of course, by Stalin. But one of the things I got from Mieville's October was that there was a huge soviet movement throughout Russia prior to the Revolution, that could well have taken the place of a populace used to democracy.
One more note about pre-Revolution Russia: it was, I have read, 90% agricultural. The Soviets had to industrialize a country, which took a good part of a century elsewhere, first gearing up to deal with the West's sanctions, then to deal with Hitler, then to deal with the West.... They had, what, 10? 15? years of reasonable peace, when they weren't preparing for being attacked, and recovering from it?
It'd be interesting to see the US's response to massive attacks from Canada and/or Mexico, with either as world powers.
Please also note that in the US, during WWI, when the railroads were squabbling, and failing to provide the required transportation, the fed established the Railroad Commission, that gave them orders. Worked, too. I'm sure folks here could think of other examples.
One that just hit, of course, was the strong federal gov't of the US, that rammed desegregation down the throats of the Jim Crow South. Was that bad?
Mmmm, how 'bout the US Civil War? I read the first Greene bolt-action rifles saw use then, and trains were massively important, as can be seen by all the attacks on trackage. And boy, did we have conscripts....
I'll be blunt: I strongly doubt that Our Gracious Host originally were thinking of the Empire as being that advanced but with uneven distribution:
Nope, go back to the original series and you'll see the clues scattered about. (Hint: "corpuscular petard" is "atom bomb"; "cronosium" is "uranium". Miriam got to see electromechanical calculators with nixie tube displays on pretty much her first trip out in the Empire; airships, sure, but also turbine-powered trains of a type that were experimented with in the 1940s-1950s in our world.)
Again: the big house she bought in "The Bloodline Feud" had electrical lighting and servant bells.
What tripped her (and your) "Victoriana" (or steampunk) alarm was the costumery. Which is down to the price of clothing being much, much higher when you don't have a pool of really cheap low-wage offshore labour to do the sewing, not to mention cheap imported cotton (hint: the Slaveowners Treasonous Rebellion in the New British Empire kicked off a century earlier and went much, much worse for the planters). Conservatism in fashion trend adoption is inversely proportional to the cost of clothing—there's a reason why, prior to the century that saw the invention of the cotton gin and the sewing machine, fashion changed on a roughly 75 year cycle (and today it's more like 7.5 weeks, in those countries that don't enforce religious dress codes).
Broader observation: the commenters on this blog are predominantly male. This means that culturally-male-gendered activities and interests get focussed on, but "feminine" interests (clothing fashion, for example: also cultural activities and social structures) tend to get overlooked, ignored, or actively deprecated. This is a mistake, as witness the chunk of Miriam's speech to the party congress on the subject of washing machines (which I kind of cribbed off the late Hans Rosling's Gapminder Foundation).
the intersection of Iris/Miriam with just the right Wolf Orchestra couple seems more fundamentally hinky
I'll concede that point. As someone-or-other once said, "in any novel the author is allowed one gigantic screaming coincidence". Yes, Iris' brother was merely the head of the Clan's security service, but that point is the big-ass coincidence that book 1 depends on.
As much as I loathe Trump, the electoral college worked exactly as it was supposed to work: it penalized the candidate who ignored multiple "unimportant" states, which is exactly what the Hillary campaign did.
For those who didn't study civics in the U.S., the idea is that a presidential candidate should act like they intend to be the president of the whole country, including the parts that don't like the candidate, and even if the candidates don't need that state's delegates to win. That means the candidate should go (for example) to Wisconson and Indiana and check in, and learn/care about the local problems. The candidate is expected to do this no matter how the candidate feels about those states or how they feel about him/her. The electoral college enforces this.
The awful coda to the whole thing is that Bill Clinton (who might just be good at politics) told the campaign they needed to work the Rust Belt, and he was shouted down by the wonks, who told him that their numbers didn't show any such need.* If you want to know how good a president Hillary would have been,** she was probably called upon to make a final decision between the wonks and the very politically experienced Bill Clinton, and notice which side she chose.
Obviously the framers of the Constitution didn't imagine Donald Trump as a candidate.
** Much better than Trump, obviously, but probably not nearly as good as Obama, Clinton, Carter or Johnson, IMHO.
Not realllly. It's more like if the US had one Supreme Court judge who was appointed by an elected committee once the last one dies (to compare to Iran ... but there's a whole lot more complexity there, like two entirely separate militaries). That individual would hold a hell of a lot of power by being the ultimate arbiter of which laws are constitutional or not, but isn't necessarily the head of government (though I can't remember if the NAC has a position like "First Magistrate" who is the head of government under the head of state).
To paraphrase Madison, the success of a form of government can't be contingent on the populace's being angelic. I fault Bolshevism because it would never actually have a New Soviet Man populace to govern, and Randism because the proper answer to 'Who is John Galt?' is 'Noöne not in an axe-grinding romance novel.'
A poem from East Germany, from someone with good reason not to be scared:
https://en.wikipedia.org/wiki/Die_Lösung
Trump just leveraged what the Republican party setup. See: https://www.nytimes.com/2017/10/03/us/politics/gerrymandering-supreme-court-wisconsin.html
(Or just google "Wisconsin Gerrymandering").
The Republicans hired a firm to come up with a redistricting plan that makes it practically impossible for the Democrats to win the state. It wouldn't have mattered how much Clinton campaigned; she would still have lost Wisconsin, even though she comfortably won the national popular vote.
No, the electoral college has been hacked.
While there is some truth in that, it's actually a secondary issue (though that might surprise you), and I was talking about much more basic and consequential mistakes.
More than Russia or sexism or email coverage or anything else, the campaign's decision to ignore the base (both in policies and in geographic location) was the strategic error that put Trump in power. Schumer phrased the plan as “For every blue-collar Democrat we lose in western Pennsylvania, we will pick up two moderate Republicans in the suburbs in Philadelphia, and you can repeat that in Ohio and Illinois and Wisconsin” and that is what they did and that is what doomed them.
Bernie would have won
the success of a form of government can't be contingent on the populace's being angelic.
I'd like to think that a government's success can be contingent on the populace of the country not being complete idiots, but my countrymen are proving me wrong. I'll be very interested to see how OGH handles the political issues in Dark State, particularly if he's cribbing the constitutional system from Iran. I should probably do a little reading, because I've always thought the Iranians were saner than propaganda made them out to be.
Fair enough: I had decoded the 'corpuscular petard' and 'light-kernel cronosium', but had forgotten them. I wonder, though, that an empire with a gigantic military would still have such expensive cloth, and (unless I missed it) no treatment for consumption at all, not just (for example) one out of reach for such as Erasmus.
I was giving a civics less for those of us who didn't go to school in the U.S., and DELIBERATELY AVOIDING all these other issues (not that I disagree, mind you) because I don't want to get too far away from discussing OGH's new/old book.
Sorry. A "civics lesson."
They had, what, 10? 15? years of reasonable peace, when they weren't preparing for being attacked, and recovering from it? Nowhere near that - an absolute maximum of EIGHT years. Post-revolution civil war ends approx 1922, Lenin dies in 1924, first big push towards terror in 1930 (De-Kulakisation, esp. in Ukraine), followed by Kirov's murder in 1934.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Thoughts about parallel development in our world ( & "Empire Games", because this is before the split ) : Washington & his slaveowning friends would have lost without massive & expensive French assistance. Which then backfired on the French, because the war-bills came in ... French royalist guvmint effectively goes bust & has to summon the estates-general in ... 1789. Um, err ....
"Broader Observation" THANK YOU Charlie ... My father taught me, very early on ... we were watching an old film on TV ( !! ) & he told me to look for two things which were an instant clue to the actual date of production, & that it also applied to still pictures, of course - one was models of cars & road transport, but he also said "Watch the women's fashions, they're a dead give-away." I recently used this to spot that a supposedly-dated picture of a London street in approx 1912 had to be post-1920, because of cloche hats visible in the street!
One, it did nothing of the sort.
Two, while you are right about the fact that the Electoral College did what it is supposed to do in 2016, this is because what it is supposed to do, what it was conceived and implemented for, is an explicitly anti-democratic and authoritarian purpose: to represent the interests of a wealthy minority over the masses, to greatly weight things in favor of land instead of people, and to null out the popular winner if necessary.
It has nothing to do with penalizing candidates who ignore multiple "unimportant" states; its design doesn't do that at all. It penalizes candidates who ignore swing states that control the balance of power. That's it.
Candidates are expected to do this, but for the most part they do it as little as possible (Clinton and Trump both only went to California and Texas to fundraise, for example, rather than to out-and-out campaign) and the Electoral College in no way enforces this. I mean, for fuck's sake. Trump absolutely did not act in any way, shape, or form like he was running to be president of the whole country, and the Electoral College came down on his side, for entirely unrelated reasons.
And they were absolutely correct to do so.
First of all, by "Rust Belt" you really mean "Michigan and Wisconsin." The Clinton campaign spent a ton, I mean an absolutely ton, of time and money in Pennsylvania.
Second of all, you realize your argument is basically anti-empirical and anti-intellectual, right? I've seen it come up time after time after time since the election and it's all post-hoc justifications.
Let's game this out. It's late August, 2016. The Clinton campaign is having a meeting to determine where they're going to spend the next two months, the height of the campaign season. All of the advisors are here. You're here. You say, "Madame Secretary, we need to invest a lot more time, money, and presence in Michigan and Wisconsin. We're vulnerable there."
Across the table, another advisor stares at you. "According to who?" they ask, incredulously. "We are consistently safely up in those states according to the polling. And not just our own internal polling, either. CNN has us safely up. So do NBC, MSNBC, and even Fox News. Let me reiterate, FOX thinks we're safe there. So does Rasmussen, which always has something like a five-point Republican bias. So does Gallup. So does literally every other major polling outfit in the nation. Hell, from what we can discern, the Trump campaign's own internal polling shows that they've got no chance there, which is why Trump isn't planning to spend any time there and why they aren't devoting resources to it. Given all those facts, why the hell should we devote more time and resources to Michigan and Wisconsin than we already are? We need those resources, and our candidate, in Pennsylvania. Florida. North Carolina. Virginia. Colorado. Nevada. Those are the purple swing states and we need 'em. What empirical, fact-based evidence do you have that we should pull resources from them and put them into Michigan and Wisconsin, which, again, even our enemies believe are safe for us?"
They pause in their increasingly frustrated speech and take a sip of water. Hillary Clinton nods approvingly, and then turns back to you.
What rebuttal do you have to this that you can construct using only the information available in late August, 2016?
Sure they did. Trump isn't even the first of his kind. He's basically the second go-round of the Jackson administration.
Yep. With the exception of certain classics — the original VW bug, Land Rover, 2CV, Porsche 911 — automotive styling has been fashion-driven ever since the late 1940s, hitting its peak circa 1950-1970 with "built-in obsolescence", in which the designs changed every year or two to drive sales to folks who wanted to be seen to be driving a new car.
(The UK got less of this post-1966 or so thanks to the annual and then semi-annual number plate letter sequences, but it's still noticeable ... and an automobile today costs about the same proportion of average annual disposable income as a well-tailored suit of clothes did in the pre-1800 era.)
MODERATION NOTICE
Further comments on the Trump campaign and ongoing investigation are banned on this thread, and will be deleted. They're derailing.
(If things heat up I might spin up a separate discussion as we seem already to be hitting Watergate-equivalent levels of WTFery in DC.)
Re: the fault is with the electorate, not the system
The system was engineered in order to best identify the electorate's needs as the country grew and its population evolved/matured. Hence the provision for amendments. The current interpretation and use of 'the system' seems to be almost entirely to ensure that the folks who had been initially preferred would continue to have their preferences met regardless of how the country/population changed. It's like insisting that baby formula be the diet throughout a human's life because it was the optimal diet when they were first born. Totally nuts and unhealthy.
As someone else mentioned, continually deferring to the original make-up of early America by insisting on the same voting outcomes is in reality a tried and proven way of getting away with merely pretending to be democratically legitimate.
I would have thought that an "eat your own dogfood" rule would come under extreme pressure when there is a need to do things quickly. If you need a high speed rail line, you can either develop your own steel grades, processing methods and rail profiles, or you can just buy a copy of BS EN 13674, which will have it all in for you, but in a strictly metric unit system. 13674 encapsulates knowledge that has taken almost two centuries to acquire; the temptation to just use that and move on would be extreme. But once you have done that, you will be tied into an entire set of standards through all the cross-referencing they contain. So EN 13674 will contain references to standards for NDT & methods for steel chemistry assessment, for instance, and they will link to other standards and so on...
The knowledge is compact, and easily legally accessible through a subscription to one of the standards organizations. The arguments over direct implementation or dog-fooding would be interesting.
How about Kondratiev waves in the presence of extreme external forcing functions?
Pretend that Kondratiev waves are somewhat real and that in OTL we're slightly over the hump of the 5th one (information technology). One could ask what comes next in OTL (AI, CRISPR?) but the relevant question here is what about the NAC, where development has been massively messed with by Miriam and will doubtless continue to experience paratemporal perturbations?
https://en.wikipedia.org/wiki/Kondratiev_wave
The knowledge is compact, and easily legally accessible through a subscription to one of the standards organizations.
IIRC, part of Miriam's program was to have lots and lots of CDs containing TL 2 standards, patents etc. shipped over to TL 3.
(I hope I'm keeping the TLs straight.)
no treatment for consumption at all
Maybe none beyond traditional remedies. Especially in times of rapid development, I don't think it's unreasonable to posit that the medical arts lagged the mechanical ones.
In one of Piper's Paratime stories there is mention of a timeline in which something like that happened: spaceflight developed swiftly, but medicine not so much. And then they went to Venus(*) and came back with a really bad bug, with civilization-destroying consequences.
(*) That was before we knew what Venus is really like.
With respect to dogfooding, it'd be interesting to know how much accidental cross-contamination came across because "work it out for yourself" doesn't necessarily mean you need to use a slide-rule when there's a pallet of HP-48s in the next room when you're designing your rockets and nukes.
This sort of ancillary tooling usage could easily lead to things like C or FORTRAN or ASCII spreading between timelines. On the other hand, I'd love to see the NAC run on Forth or Lisp rather than C.
Don't recall whether you've already discussed this but since you're addressing impact of demographics on a society ...
Pathogens impact the demographic profile of any nation to the same extent as wars, famine, natural disasters, or sophistication of business and military infrastructure.
Human population movements include migrations of pathogens. So although the new worlds now have CRISPR tech, you'd still need a truckload of human samples from each world plus well-equipped bio labs before you could correctly screen for pathogens let alone develop and dispense appropriate vaccines. Wondering how anyone would recruit microbiologists to these labs given the risk of setting off a global pandemic.
Miriam was a tech journalist. I doubt she'd allow anything like cross-timeline computer-language contamination to happen and she probably pushed a different chip technology... RISC instead of CISC maybe?
https://www.youtube.com/watch?v=VP9wkTEgORI
Interesting tidbit here for Man in the High Castle. (PK Dick novel, now an Amazon Prime series, Axis won WWII, USA divided between Japan and Germany. Major suppression effort to prevent dissemination of subversive novel about an Allied victory that sounds like our world.) I've not watched the show yet, wanted to give it some time and find out whether or not the fans are ticked at where the quality went. But this tidbit came up as a teaser. Nazis have discovered "travellers" and are now working on a mechanical means of travel to other worlds.
Makes me wonder if they were directly influenced by the Family Trade or if they're just drawing water from the same memetic well. It's funny how similar ideas will crop up, the same two or three ideas bump together in one writer's head and repeat in another's without any direct influence. Sort of the same way JK Rowling and Neil Gaiman both came up with a brown-haired, bespectacled boy-wizard with a pet owl. It's all there in the soup.
The Guardian notes our interest the elder gods: https://www.theguardian.com/science/2017/oct/31/searching-for-the-old-ones-lovecraftian-giant-cephalopods-fossil-record
Creatures from the deep have been inspiring fiction authors for generations, from H.P. Lovecraft's Great Old Ones through to China Miéville's Kraken (in which the Natural History Museum giant squid specimen has a role to play). Cephalopods are known for their intelligence, so there may be some substance to the idea of ancient brooding organisms living in the depths of the ocean, but fortunately it's only their beaks and gladii they leave behind.
... or that's what they want us to think.
I want to believe that they're making something like the 6502. Also, our world is basically awash in RISC cores, so I'm sure the tech censor made sure they didn't make the CISC mistake. Even Intel's new stuff is RISC underneath; they just have a hardware decoder that translates a weird legacy CISC instruction set into internal micro-ops.
'MITI enforces an "eat your own dogfood" rule' Is their name a shoutout to these guys?
I hope Miriam also brought them some books about the Meiji Restoration.
Not quite, but it's complicated... Only the simplest chips are actually RISC at heart. Intel and AMD desktop and laptop CPUs are a mix of technologies with no clear RISC/CISC differentiation in the silicon and to quote XKCD, "there's a lot of cacheing involved."
The big tech win in CPU design was ever-smarter branch prediction (and cacheing), the RISC vs. CISC battle was eventually a draw sort-of.
Anyone starting from scratch to design CPUs that are useful for general-purpose computing will go CISC since a complex instruction that takes 8 cycles at 1MHz will take four RISC ops in 20 cycles at the same clock rate along with instruction fetches and RAM accesses at similar clock speeds. RISC is initially for the chips that can be integrated into doorknobs at two dollars a pop where performance doesn't really matter. Further along the line as transistor counts get into nine or ten figures and the caches multiply the RISC/CISC boundary gets fuzzy.
The instruction set remains the same, though, as the historical software burden necessitates stability in that particular API. I have a graphics package, Corel Draw, that I run on this 3GHz quad-core CPU in front of me. I first ran the same executable image on a Pentium II machine back in 2002, and I know it would have run on a 386 even without a maths co-processor (the "About" box reports it was released in 1998). Changing the instruction set of the CPU would mean that software investment would be lost without complex and performance-sapping software emulation.
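The cycle figures quoted a couple of paragraphs up can be made concrete with a back-of-envelope calculation (these are the commenter's illustrative numbers, not measurements of any real CPU):

```python
# Using the figures quoted above: one complex CISC instruction at
# 8 cycles vs. the equivalent four RISC ops totalling 20 cycles,
# both on the same 1 MHz clock.
CLOCK_HZ = 1_000_000

cisc_us = 8 / CLOCK_HZ * 1e6    # time for the CISC instruction, in microseconds
risc_us = 20 / CLOCK_HZ * 1e6   # time for the equivalent RISC sequence

print(f"CISC: {cisc_us:.0f} us, RISC: {risc_us:.0f} us, "
      f"ratio {risc_us / cisc_us:.1f}x")
```

At equal clock rates the complex instruction wins by 2.5x on these numbers, which is the whole argument for CISC when your fab can only give you slow clocks and few transistors.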
Or tried to build Project ORION. But there's always a first time ...
I caught the bits about the "corpuscular petard," etc., in the first series, but Project Orion slipped by me... was it the rocket project in the Southern Hemisphere?
"We'd love to have friendly and happy relations with the reality next door. But first, let's show you what happens if you violate your treaty terms..."
Changing the instruction set of the CPU would mean that software investment would be lost without complex and performance-sapping software emulation.
Which is very often a price worth paying. At the trivial level there are emulators for old microcomputers and video games that run in all sorts of places - even browsers - because the gulf between what's common now and what the setup requires is so vast.
At a more useful level, on some embedded machines "backwards compatible" involves emulation either in hardware, microcode, or sometimes in the OS. We have 32 bit embedded processors where you can install an OS image that will emulate the 8 bit predecessor (the 16 bit ones are mostly a simple cross-compile). Sometimes you just gotta wear the footprint hit because the 8 bit code is a ball of hand-coded assembly but you still need the functionality. Reporting bugs on this stuff can be a nightmare, but also revealing. Firmware bugs usually get fixed faster than microcode ones and hardware (silicon) bugs even more slowly. Years, in many cases.
Even more prosaically, there are legacy serial emulators in many modern devices, and some of them are a bit nuts. We have a USB+Wifi board that has a 32 bit micro on it that among other things emulates a 16550 serial chip/interface. Just in case you have a physical TTY you want to hook up (actually, many things use that interface).
Which reminds me: always ask how you update the firmware.
This came up when a friend was looking at an IoT sex toy, and wanted advice. The simplest I could boil it down to was the above. If it connects to the internet, it is inevitable that eventually a bug will be discovered. At that point you either update the firmware or throw the thing away. Well, or just accept that your device is now (at best) part of a botnet, but is likely also providing everything it knows to anyone who asks. If you're unlucky it will have known vulnerabilities or be hacked right out of the box. "zero day" in a different way :)
"The argument from ignorance isn't extra convincing when you're extra ignorant".
Have been banging my head against "but I don't know that it can't work" type arguments a lot recently. Saying "then quit bothering people who do know and do whatever it takes to find out" doesn't seem to work.
There was a mention of the delivery of the latest batch of pits at one point.
Well, it's a very useful interface, because it's so bloody simple. You can implement it with a trivial amount of software, or even with no software at all and no CPU, just a few gates and flip-flops. With USB on the other hand you need something more than a common 8-bit CPU just to run the stack, let alone do anything else at the same time, which in technical language is known as a pain in the arse.
(Though USB itself seems to occupy a somewhat similar position higher up the complexity scale, so instead of a 16550 with a PCI interface, you get a 16550 emulator with USB on the other side, hard-wired to a USB-to-PCI interface chip, etc.)
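The "trivial amount of software" claim above is easy to demonstrate: the whole framing layer of a classic async serial link is a start bit, eight data bits LSB-first, and a stop bit. A minimal sketch:

```python
def uart_frame(byte):
    """Return the line-level bit sequence for one 8N1 serial frame:
    start bit (0), eight data bits LSB-first, stop bit (1).
    The idle line is high. This is the entire framing layer of a
    classic UART link, which is why it fits in a few flip-flops."""
    bits = [0]                                   # start bit
    bits += [(byte >> i) & 1 for i in range(8)]  # data bits, LSB first
    bits.append(1)                               # stop bit
    return bits

# 'A' is 0x41 = 0b01000001
print(uart_frame(0x41))  # -> [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
```

Shift those ten bits out at a fixed baud rate and you have a working transmitter; contrast that with the protocol stack USB requires before it can move a single byte.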
The Commonwealth has to deal with the situation that on the one hand, they want to get up to speed as rapidly as possible, for which purpose it would be distinctly crippling not to be able to take advantage of the vast amount of software already in existence in "our" timeline, but on the other hand so much of that software is written for a CPU architecture that should have been abandoned 30 years ago and really isn't the sort of millstone you want to tie around your own neck. And on the third hand a huge amount of current software is mere bloat plus you probably don't want to use anything released after the existence of the Commonwealth becomes fully known to the US. So it might well make sense for them to go for a simplified architecture plus partial emulation - a CPU that can execute the common/simple x86 instructions natively at full speed, but keeps itself structurally simple by not attempting to execute the more elaborate/rare instructions directly, instead interpreting them as traps to a sequence of native instructions that emulate the relevant function.
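The trap-to-native-sequence idea above can be sketched in a few lines. Everything here (the opcode names, the invented ADD3 "complex" instruction) is hypothetical illustration, not any real ISA:

```python
# "Native" instructions the simple core implements directly.
SIMPLE = {
    "MOVI": lambda regs, d, imm: regs.update({d: imm}),
    "ADD":  lambda regs, d, a, b: regs.update({d: regs[a] + regs[b]}),
}

# "Complex" instructions are not in silicon: each one traps to a
# canned sequence of simple instructions that emulates it.
TRAPS = {
    # ADD3 d, a, b, c  =>  d = a + b + c
    "ADD3": lambda d, a, b, c: [("ADD", d, a, b), ("ADD", d, d, c)],
}

def run(program, regs):
    for op, *args in program:
        if op in SIMPLE:
            SIMPLE[op](regs, *args)      # fast path: execute directly
        else:
            run(TRAPS[op](*args), regs)  # trap: expand into simple ops

regs = {}
run([("MOVI", "r1", 2), ("MOVI", "r2", 3), ("MOVI", "r3", 4),
     ("ADD3", "r0", "r1", "r2", "r3")], regs)
print(regs["r0"])  # -> 9
```

The hardware win is that the decoder only has to be correct and fast for the small SIMPLE set; the long tail of rare instructions costs microcode, not silicon.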
The Civil War certainly used trains and conscripts, but the first war to leverage rapid, pre-planned mass mobilization plus rail transport as a first-strike weapon was probably the Franco-Prussian War.
The Civil War also featured repeating lever-action rifles (the Henry) with relatively large magazines.
Don't think about bringing UNIX or Windows over. But if you have FORTRAN, you can grab LAPACK and BLAS and essentially import a few PhD programmer-centuries of work on making math fast.
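To illustrate the point about importing LAPACK/BLAS wholesale (using numpy as a convenient wrapper here rather than raw FORTRAN, since numpy's `linalg.solve` delegates to LAPACK's `*gesv` routines under the hood):

```python
import numpy as np

# Solving a linear system in one call: decades of numerical-analysis
# work (pivoting, stability, blocking) inherited for free via LAPACK.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

x = np.linalg.solve(A, b)  # LAPACK dgesv underneath
print(x)  # -> [2. 3.]
```

Reimplementing even this much from scratch, correctly and stably for ill-conditioned matrices, is exactly the "PhD programmer-centuries" the comment is talking about.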
I think the way to do it is to bring in laptops with Slackware installed (because Slackware is the conceptually-simplest Linux that really works*) and also bring in a few of the best critiques of C. So the idea is to start with C (plus assembler) and develop a native language without C's baggage. You end up with a native version of Rust, Go, Ada or something similar, plus an emphasis on security. This effort is run in parallel with your chip-building, and the idea is to build the simplest chip(s) that will run something UNIX-like.
Then rebuild the UNIX kernel and utilities with the new language, (but not the shells or editors - these should be original to your own plane of existence.) The utilities should all get new names and different modes of presenting information. All your high-level programs should be original, with vi and emacs going away first. (In fact, if someone builds vi, emacs, or BASH in the new language, you fucking kill them.)
The next step is to build a really secure network, with encryption built in at the chip level, which can accept firmware upgrades, (you're probably building out your phone network at the same time, which helps) and it is against the law for serial or network ports to be on the motherboard. They are always on a daughterboard, which makes hardware upgrades easy. If you want to get really paranoid you arrange for the chips to randomly negotiate pinouts, voltages, etc., on the fly, just to make for really interesting standards. Then you deprecate standards every couple of years. (And for Bob's sake, build an IP addressing scheme that allows you to simply add an octet if you need it! GRRRRR!)
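The "just add an octet" gripe above amounts to asking for variable-length addresses. A toy length-prefixed encoding (entirely hypothetical, not anything described in the books) shows how cheap that is to specify up front:

```python
def encode_addr(octets):
    """Hypothetical variable-length address: one length byte followed
    by the address octets. Growing the address space later just means
    writing a bigger length byte, with no IPv4-to-IPv6-style flag day."""
    assert 0 < len(octets) <= 255 and all(0 <= o <= 255 for o in octets)
    return bytes([len(octets)]) + bytes(octets)

def decode_addr(buf):
    n = buf[0]
    return list(buf[1:1 + n]), buf[1 + n:]  # (address, remaining bytes)

# A 4-octet and a 5-octet address coexisting on the same wire format:
wire = encode_addr([10, 0, 0, 1]) + encode_addr([10, 0, 0, 1, 7])
a1, rest = decode_addr(wire)
a2, _ = decode_addr(rest)
print(a1, a2)  # -> [10, 0, 0, 1] [10, 0, 0, 1, 7]
```

The trade-off, of course, is that fixed-width addresses are much friendlier to hardware routing tables, which is part of why our timeline didn't do this.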
When your first computers come out they've got their own chips, their own language, their own operating system, and incredible security. The first thing you do is install one at every college or university and start "Model Railroad Clubs." Everything is Open Source, and "Security Fights" (hacking) is a sport with results reported in the newspaper, plus a government rewards program for discovering security holes. Lastly, you develop an HTML-like language for addressing the screen, because hopefully a web-like thing is coming soon, so why build an extra layer?
What made the most lasting impression on me in Palimpsest was the strategy of using the most downtrodden humans as the seeds of new populations. Pick the survivors, not the "winners."
"Never bring a fact to a derp fight." Bitter wisdom of this age.
Anyone starting from scratch to design CPUs that are useful for general-purpose computing will go CISC since a complex instruction that takes 8 cycles at 1MHz will take four RISC ops in 20 cycles at the same clock rate along with instruction fetches and RAM accesses at similar clock speeds. RISC is initially for the chips that can be integrated into doorknobs at two dollars a pop where performance doesn't really matter. Further along the line as transistor counts get into nine or ten figures and the caches multiply the RISC/CISC boundary gets fuzzy.
There's a large number of folks who design semiconductors who would totally disagree. Their opinion, and mine, is that CISC (err, Intel) won by coming up with a chipset that got put into a device which sold first in the tens of thousands, then in the hundreds of millions. With that much profit to plow back into R&D, you can paper over a lot of inefficiencies. ARM took off because of phones, which were another market in the millions and now billions. With that much money you can really work hard on fixing bad decisions made in the past.
RISC never got to those numbers and thus never had the R&D funding to keep rolling out next-gen CPUs on an annual basis.
But I've noticed that you and I almost never agree on tech so I'll let this drop now.
"And if you're going to say that the Commonwealth is a special case because it's preparing for war, again no not necessary. The USA out produced everybody else in WW2,"
I think you're trying to pull a fast one there.
Because in WW2 the USA had exactly the sort of strong govt control of a highly regulated economy that you're saying isn't needed.
Rationing, remember? Ration Boards, War Production Boards? A production board in D.C. deciding in 1942 that cars shall not be built in the USA for a few years because the govt needs the factories for something else?
Because the USA in WW2 was not the USA after the revolution, with no tradition of democratic control. There was a slow evolution from a Jeffersonian ideal of a toothless federal govt with no power, to the post-New Deal 20th Century federal govt with vast power. But in doing so the USA developed the democratic traditions around transfer of power that lets you control that monster. A new revolutionary govt in a nation with no democratic tradition doesn't have that culture.
A lot of that change was done during WWII. The administration just did it and Congress mostly went along.
Apologies if this has been dealt with before, but are Annie's and not-then-Erasmus's kids ever getting dealt with? Is it sort of assumed that the orphanage policy in Australia would have killed them off? Or did they get retconned out after the edition I read/get binned as uninteresting? It just seems that a NAC Commissioner would have the pull to look into his essentially stolen children, and him not doing so would be an interesting character development.
CISC and a lot of really neat ideas for general-purpose commodity CPUs, like branch prediction and out-of-order processing, will work fine if you've got 20nm-resolution silicon exposure plants from day one, so you can put billions of transistors on a die that costs a few bucks to manufacture[1]. Start where Intel did back in 1974 with the 8080A, on a 6000nm process: the best they could manage was about 6,000 transistors doing simple ROM-based instruction decoding at about 2MHz while consuming 1.3W. Emulation, cacheing, all the really neat stuff we take for granted today takes lots of transistors, and power savings from other silicon tech -- an 8-core Ryzen CPU has 4.8 billion transistors at 95W dissipation, running everything at 3GHz plus.
Unless the worldwalkers can obtain, transport, set up, maintain, supply, staff and operate a modern silicon line to produce 14-20nm tech chips en masse in their own worldline then they're going to be stuck in the flint knives and bearskins world of 8080A and maybe 68000 (so named because it had 68,000 transistors) tech from day one. If they're only importing knowledge and building silicon lines themselves they're a bit like Cody or the Wright brothers knowing that somewhere else aircraft can do Mach 3 at 25km altitude but they're stuck with wire and doped cotton over spruce struts to build their own planes out of. The good thing is the world-walkers can import a lot of design tools and expertise to make much better versions of simple CPUs locally but it's still likely they will be x86-compatible at the silicon level until their resolution gets good enough to do it in software on-chip because of the available software they can buy off the shelf in our world.
[1] I read one Intel engineer's report of the 8080A rollout. From memory, "The first chip cost us a million bucks. The second one cost fifty cents."
Yes, that's weird. The Henry repeating-rifle was invented & used in the US during their southern treasonous rebellion, BUT ... why didn't they "simply" mass-produce it for most troops, rather than allowing vast numbers of conscripts & volunteers to be killed by the rebels because they were still using slow-fire muzzle-loaders? Very curious, that.
"If you want to your society to progress - and not just technically, but also socially - a "strong central authority"is the last thing you want."
I agree. But I only agree over the very long term.
(Also, there's a big middle ground between the "very weak" central authority the US had in 1790 and a "strong" central authority, but let's gloss over that because this is interesting...).
Japan, Singapore, China, the Soviets up until 1960, Nazi Germany - they all had economic booms in highly controlled market economies, with strong central government. US during WW2 as well.
Krugman had an interesting article on this a couple of decades ago that I can't now find. He pointed out that central authorities - the govts pushing the "Asian tiger" economies of the 90s, the Soviets of the 50s - are very good at mobilizing resources. They can get capital going where they want, labour going where they want. I think his example was Soviets mobilizing capital so that instead of nobles having gilt plates and peasants digging with shovels, the gilt plates were sold off and the peasants got tractors. That sort of thing makes a huge difference. And has flow-on effects.
But what they tend to be bad at is productivity growth. Getting people working "smarter not harder". Which in the long term is what really matters - but only in the long term.
Which is where the "transition to market economy" part of development economics comes in. Once the govt has interfered to pull the economy up by the bootstraps, it needs to stop interfering.
But that's what govts are worst at - stopping interfering. Stopping supporting industries that were supported in generations past. The way the US govt subsidizes the ranching, farming and the fossil fuel industries in the 21st century are very typical examples of that.
Which gets back to: any revolution has to make it to the long-term, and there are some hard transitions to survive on the way.
Yes / no / maybe. What Adam Smith actually said ... That guvmint should do what we would now call "pump-priming" of investment & technological advance & then retreat & allow the "invisible hand" to take over the heavy lifting, whilst reaping a share of the profits in increased tax revenues from the greater overall wealth.
[ As opposed to the bollocks talked by the "Adam Smith Society" who have all-too-plainly never read his work. ]
"All you need is a (1949-vintage) B-36 Peacemaker bomber-equivalent"
If you're happy to convert an existing passenger liner, then world walkers stealing planes would be amazingly easy. Book a ticket on the plane you fancy. When airborne and out of range of communications, just world walk, taking the aircraft with you. (MH370?) You also get two trained pilots (and a bunch of passengers that you don't really need, but them's the breaks). You'd probably also need a radio on the ground to let the pilots know what the situation was and where they could land.
One could ask what comes next in OTL (AI, CRISPR?) but the relevant question here is what about the NAC, where development has been massively messed with by Miriam and will doubtless continue to experience paratemporal perturbations?
Kondratiev waves are bunk in the context of a rapidly-developing economy. Apply them to Japan 1860-1910, for example, or South Korea 1970-2000, and they break. Or China, circa 1980-the present. (China in 1985 was still basically third world. China today? In summer I can't walk down my local high street for tripping over Chinese (mainland) tourist groups. If you can afford package holidays on another continent by airliner, then I submit you're probably not a subsistence peasant farmer in an undeveloped nation ...)
(I hope I'm keeping the TLs straight.)
You will be unsurprised to learn that I had trouble with that as well.
In the original draft, I gave the Commonwealth and the USA different and (to them) correct timeline numbering schemes, so the Commonwealth people called theirs time line three (because they inherited their numbering scheme from the Clan), and the US called the Commonwealth something like time-line 2-136-indirect (TL 2 being the Gruinmarkt, 136-indirect being the 136th timeline indirectly reached via time line 2).
Needless to say, this confused the fuck out of me, my editors, and all my test readers, so badly that it was a total train-wreck. So I took a deep breath, and renumbered everything in terms of the Clan nomenclature Miriam used in the first series, including in dialog ... and none of my readers have called me on it so far! WIKTORY!
Note that consumption (tuberculosis) killed up to 30% of the population of Victorian England. And while we have a vaccine and drugs for it today, MDR-TB keeps out-evolving our treatments. The real decline in TB in the UK coincided with a couple of factors: (a) the rise of the automobile and the corresponding fall of horse-drawn urban transit (meaning: decline of urban stables), (b) pasteurization of milk (hint: the bovine TB reservoir), (c) rise of indoor gas/electric/central heating (reduced chill and humidity), and (d) the Clean Air Act (ban on burning coal in urban areas, which also got rid of the smog).
The New British Empire was still mostly coal-powered (hint: late development) so smoggy as hell, and was still using horses widely for urban transport as late as the 1970s (equivalent of the 1890s in the USA). Hence the prevalence of TB.
This sort of ancillary tooling usage could easily lead to things like C or FORTRAN or ASCII spreading between timelines. On the other hand, I'd love to see the NAC run on Forth or Lisp rather than C.
You called it right there. But the point is, C and FORTRAN and ASCII are 1970s/1950s/1960s technologies. By 2010-2020 we've had 40/70/60 years to identify their flaws and weaknesses. We are stuck with them because we have billions of lines of code to maintain. The Commonwealth has the luxury of using them as teaching opportunities for their first crop of CS students, to explain stuff like why null-terminated strings and manual garbage collection are a bad language design decision.
Meanwhile, there's enough prior art to show why Forth and Lisp are extremely useful in their respective roles, and if you want to give kids an 8-bit micro with a programming environment to train on (think Commodore-64/Sinclair Spectrum equivalent) you want to give them one with a rationalized version of Forth in ROM and a basic Lisp-like language as a tape-loadable "this is what real programmers use" environment, rather than garbage like early BASIC. (Don't tell me this is impossible: it nearly happened in our time line. Personally I blame Bill Gates!)
SISAL! The number crunching language of champions! :)
OK, it was a very promising experiment that got killed by the FORTRAN lobby. Nobody really knows how it would have panned out if it had more money thrown at it, but that's the point right?
This still doesn't preclude the possibility that the GCC or YACC you've brought with your Slackware is compromised along Ken Thompson's line on trusting trust. You would really need to be making any compilers from scratch on your novel architecture's machine language, learning from but not actually re-using prior art on the topic.
[I]f you want to give kids an 8-bit micro with a programming environment to train on (think Commodore-64/Sinclair Spectrum equivalent) you want to give them one with a rationalized version of Forth in ROM and a basic Lisp-like language as a tape-loadable "this is what real programmers use" environment, rather than garbage like early BASIC.
Lately I've been reading Commodore 64 and Commodore 128 documentation and programming books from their era, because I'm interested and we have a C=128 lying around in the cellar - could be fun to code something on it, and perhaps show the kids what we had to put up with (for about two minutes until they get bored). There's of course the BASIC, which was quite horrible on the C=64 and less horrible on the C=128 (commands for graphics and sound in the BASIC!), but many books about programming these venerable computers go from BASIC to Assembler.
This is not an improvement. The computers had other languages, of course, and I really have no idea how much they were used in the real world, but it seems to me that much effort was put in writing books about how to program these computers in Assembler. That just teaches even worse tricks than BASIC and even more horribly, many of the tricks are applicable to only that hardware. There are eldritch horrors like self-modifying code and switchable RAM and ROM banks and all kinds of stuff I'm happy nowadays the operating system protects me from with many, many layers.
What the world could've been if those had had something more easily usable than the BASIC and Assembler! The C=128 even had a built-in monitor program for brain-dead coders (it wasn't even a symbolic assembler).
That's pretty much the sequence.
Start with a simple 8-bit instruction set like the 6502. Use this for your first discrete transistor minicomputers.
As you get the ability to manufacture gate-level ICs, you roll out a 16-bit extended architecture — think early ARM 1 — and build mainframes/minis around it. A bit later, you manually tape out your first 8-bit CPU for mass production, like the 6502. (This is your 1974-equivalent year.) This coincides with LSI showing up and going into the 32-bit version of the architecture.
The 8-bit CPU goes into the "home computers" which are also intended to take a modem and work as an online terminal for the 32-bit city-wide data processing utilities (think MULTICS as originally conceived). Because we're not stupid, they cost a bit more but ship with floppy disks and monitors from the start — home cassette tapes and TV sets haven't had decades to get bedded in as consumer items in the Commonwealth yet, so you can't count on the users owning them. These "smart terminals" serve as office computers and educational tools as well as home computers: think in terms of the Amstrad PCW and BBC Model B. (The nearest US equivalent would be an Apple IIe, if it came bundled with built-in floppy disk drive and printer and monitor.)
The next tick of the clock is the 16-bit business/home micro which has a hard disk and runs a GUI front end and also provides VMs for 8-bit apps. (Again: this has been done with commercial success in palmtop form, by the Psion Series 3 operating system, EPOC/16.) Needless to say, it serves as a terminal for The Big Computers that our Cybersyn-like planning software runs on. But it can also run GUI apps, initially one at a time, a bit like a circa-1985 Mac (if the original Mac had arrow keys and could run Apple II software).
By the time we get to 32-bit VLSI microprocessors it should be pretty obvious where this is all going, right?
Other requirements: baked-in code and data memory separation.
Formal verification and proof of correctness of the 8-bit core instruction set and the 8-bit micro architecture — for this, the architects are allowed to go hog-wild with imported US verification tools, as long as they're completely air-gapped from the final product (the masks for which, like the 6502, are taped out by hand). We want a solid foundation, after all.
A network stack that supports end-to-end packet encryption (spiked by the NSA in the mid-80s with TCP, with consequences we are all wearily familiar with).
A hypertext protocol that is used for hypertext, not as a tunnel for god knows what bastardized reinvented-by-idiots version of RPC. Versions of RPC and DCE that mere mortals (like the sort of people who invent web apps with REST and SOAP) can understand and use.
Oh, and most importantly, a billing/subscription model to pay for commercial content delivery, instead of advertising (to try and prevent the spread of clickbait and malware at source).
Yes, of course it's a shout-out. (Remember, Miriam is married to the minister of propaganda? And, per Chomsky, in a democracy the tools of state propaganda have to be a whole lot more subtle than standing on a soapbox yelling "ALL GLORY TO THE HYPNOTOAD".)
Replying to both posts by Icehawk
Someone else pointed out that the absence of central authority gives you Somalia. Good point. So I will modify my position to that historically over-controlling has made things worse.
You're right that the USA stepped up central authority in WW2. But the point I was making is that Germany, which had more central authority over development and production, did much worse. I guess there's a balance point where you have enough but not too much.
Another way of looking at it is that central authority should say "Don't do that" whether it's social issues like discrimination against women in the workforce or technical issues such as using unencrypted protocols. But saying "this is what you should do" is less likely to be right, and discourages initiative.
This is not an improvement.
Though, yes, those 8-bit computers (and even 16-bit computers) taught my generation a lot about hardware. Even doing stuff on an 80386 was quite quickly an exercise in assembler and the interrupt lists available on BBSes were very useful. I never got into the VGA magic, but it was interesting to read about it later. Though I rarely need the hardware understanding in my current life, it comes in handy occasionally.
Kind of what Arduino can be used to teach nowadays. Even that uses something resembling C, though.
Trusting trust attacks can be mostly avoided by simply not implementing compilers in the language they compile.
E.g. running your C compiler on a LISP interpreter that in turn runs on a minimal FORTH-alike coded in a few hundred lines of well-documented ASM on the bare metal isn't going to win any prizes for elegance or performance, but there is nowhere for the Thompson hack to hide.
Still got to trust the hardware, but if you can't do that you are screwed anyway.
Greg, David L: your more recent off-topic comments have been deleted, per moderation notice in comment 97.
Missing, presumed dead. (This happens to quite a few minor name-check characters, and some less minor ones.)
You want to wait for "Dark State", which explores this issue in depth. Hint: it's feasible, but it's not as simple as it sounds at first.
We are stuck with them because we have billions of lines of code to maintain. The Commonwealth has the luxury of using them as teaching opportunities for their first crop of CS students, ...
And here we have the perfect recipe for a second system disaster. Redesign everything from the CPU up. What could possibly go wrong?
I don't see CPU architectures as important, it just won't matter if the architecture has been formally proven or not. Programming languages don't matter that much. Personal spreadsheets matter. Email matters. Facebook matters, if only as an example of what not to do. Making sure women and minorities don't get excluded matters.
Ridiculous idea of the day. Cheap hypersonic international travel using black hole gravity assists. You just need an aircraft with a completely sealed, vacuum ready cabin and very good timing.
For a worst case, imagine if the Commonwealth stole the schematics and source code and then insisted on running their Five Year Plans on home-brew LSI-11 hardware (good, but limited — there's a reason DEC sunsetted that architecture when they did — and there was an early Soviet home computer that did just that!) running MUMPS or maybe Pick. And rolled these out as CS teaching platforms, of course ...
Bingo.
Other stuff you don't want to risk importing: microprocessors with a secure enclave running some god-awful blob of unauditable software (hello, Intel!), microprocessors where the mask was compromised to incorporate a hidden secret processor with Ring 0 access to the actual hardware (you can lose an entire Pentium inside a modern chip with 3-4 billion components).
Once the USA realizes the Commonwealth exists they will prioritize finding and attacking any contraband-component supply chains they can identify, as a matter of extreme urgency. Their initial assumption will be that the backward Commonwealth runs on cloned/pirate software and hardware (like the USSR), but they'll wise up rapidly.
In fact, once the Commonwealth realize they've been rumbled, MITI needs to enforce the "eat your own dogfood — NO exceptions" rule rigidly, or they're going to be hit by an enemy who is about 50 years ahead of them in this particular field and who has shown no compunction about waging cyberwar against perceived opponents even during peacetime.
"our news agencies print propaganda as fact"
All mass market media does this.
Oh, God :-( This whole RISC/CISC religious war is a prime example of engineers and mathematicians being replaced by marketeers and demagogues. RISC was a justified reaction against the VAX and later 68Ks, but became a fanatical religion and dogma before it was ever implemented. Good behaviour (including good performance) comes from good design, nothing else, and so-called RISC designs are often MORE complicated (and often slower!) than many of their CISC predecessors. You don't seriously imagine that the designers of discrete-logic machines, when 48 KB was a lot of main memory and suitable memory for microcode was still in the future, put ANY complexity into their designs that they didn't need?
Furthermore, most of the more extreme performance features of modern designs (e.g. the complexity of caches, branch prediction and related horrors) were added almost entirely to cover up the deficiencies of modern software. C++ as she is spoke (NOT as Bjarne intended) is perhaps the worst, but a lot of the more 'modern' and 'better' languages are nearly as bad. In particular, the information that the programmer has about locality, control and data flow is very rarely expressible in the language, and even then is generally thrown away by the compiler, so the hardware is effectively trying to reverse engineer the software on the fly!
And then there's the security issue, but that's better for another post.
Programming languages don't matter that much. Personal spreadsheets matter. Email matters. Facebook matters, if only as an example of what not to do. Making sure women and minorities don't get excluded matters.
This is an issue of network externalities; and it's important not to be trapped in a sub-optimal position, e.g. using Microsoft Word as a default standard for editable business documents (which wastes huge amounts of time in tiny increments every day because its UI is a train wreck collision between two incompatible models for document markup).
Hint: there's a reason I used EPOC16 (aka SIBO) as an example earlier. Alas, the Wikipedia entry on it mostly focusses on EPOC32 (later renamed Symbian) and contains errors. Let's just say, it was properly designed, and later iterations of the Psion Series 3 crammed the entire pre-emptive multitasking OS and a full, serviceable office suite into 1MB of ROM.
Yes. Security-through-obscurity doesn't work - it's how governments keep 'official secrets', after all, and we all know how that goes. Furthermore, every complication you add creates the potential for new loopholes and, worse, new and (obviously) unpredictable emergent properties. Modern computers already run programs that are larger than human DNA, based on designs that aren't as predictable or even as reliable as is generally claimed, with nearly as many states as the human brain. Even worse than the security consequences are the RAS ones - I remember when we (in the IT industry) identified the cause of most bugs, even if we baulked at fixing them - but, now, almost all bugs are bypassed, described as 'features' or simply denied. And, while most of those bugs can be lived with, occasionally one completely prevents the system from working and it is impossible (or almost so) to find even the most inappropriate bypass! Remember when the air traffic control went offline for well over a day, or when National Westminster (if I recall) lost ALL their ability to trade for 2-3 days? The RAS of modern computers is little better than the best ones were in the 1970s, and a lot of the problems are emergent properties or close to it.
The solution is to pick up the successful work of the 1980s on capability machines and systems, where the hardware and software was designed to give mathematically proven security guarantees. That could be extended to programming languages and higher-level interfaces, but almost all of the IT world has been headed in the other direction since about 1980! In languages, Ada and, to some extent, Haskell, (modern) Fortran and a few others are worthy exceptions, but are not mainstream. Another example is the Internet protocols - OSI was a bureaucratic monstrosity, but there was at least an attempt to design it a whole, and it simply did not have many of the problems that bedevil us today.
I can't say that I am sanguine that another civilisation could do better in a short space of time, because the problem with all engineering is that 90% of the hard problems arise as you try to turn the high-level designs into blueprints. And it is absolutely fatal to override the doubters, on the grounds of lack of time - as every mathematician knows, if you can't prove it rigorously, don't assume it's true - and, even if you can, you still need to prove your proof.
I know that whatever happens to the King in Exile will not involve his restoration. However, I do wonder how the King in Exile thinks his restoration will happen.
He's seen that the NAC has greatly changed in the 17 years of his exile, into a polity that is overwhelmingly hostile to the Ancien Régime. How can he hope to be restored aside from a military invasion by the French (impractical due to nuclear weapons) or a well-timed coup (also unlikely to succeed without lots of covert help from within the NAC government)? Shouldn't he just resign himself to being Timeline 3's version of King Michael of Romania?
Of course, the capacity of people (and governments) to deceive themselves should never be underestimated. And who knows? Perhaps the King in Exile thinks he can put himself forward as the next First Man - not dissimilar to another fallen monarch of Timeline 2, Simeon II of Bulgaria, who also served as 48th Prime Minister of Bulgaria. Fat chance, IMO, but you never know.
It occurred to me just before I posted: Charlie is making direct comparisons between Iran and the NAC. Perhaps the ~USA will take a page from their own playbook regarding Iran: stage a coup and put the King in Exile in charge, just like they did with the Shah of Iran in 1953. There's no way that that could have negative consequences.
...to explain stuff like why null-terminated strings and manual garbage collection are a bad language design decision.
That's an interesting perspective, as I've spent most of my career in embedded systems (defence avionics, telecom, and consumer). The structures and idioms offered by C/C++ allow the use of both heap and stack; in both "manual" memory management and more automated approaches.
So, to counter your example, please explain how I avoid the garbage collector getting rid of that memory-mapped register object that's been sitting idle for a while? Do you know many Embedded Java programmers?
Yes, getting into the habit of memory management takes more effort, but it's also more efficient when you're limited in memory and clock speed (and makes you a more disciplined software engineer). Back in the 1990s I was doing avionics work in C on 12.5MHz and 25MHz SPARCs, and each processor had at most a megabyte of RAM. There was no disk, we boot-loaded all the code from EEPROM, and the only way to get the necessary processing done in the time available was single-threaded execution, and some really cheesy direct memory management. Then we started to see the first Mil-Spec PowerPC boards, and it was all VxWorks and C++. Yay for VxMP...
Suggesting that we trust a non-deterministic garbage collection algorithm to solve our memory problems in a hard real-time system would have resulted in a certain amount of laughter...
The nice thing about C/C++ is that it operates across most problem domains. Want to try making your eyes bleed with some template metaprogramming? Make the compiler whimper with some weird-ass inheritance schemes? Do some really simple "Hello, World" stuff for an 8-bit processor? Carry on. IMHO, it's not the weight of the past that sees us using C/C++; it's the flexibility. It's analogous to the success of the English language; not perfect, not always best, but "good enough" in nearly all cases.
C++ doesn't borrow concepts and idioms from other languages; it follows them down a dark alley and mugs them. Comes out a bit later, whistling and wearing a new "C++ 11/14/17" suit that might not quite fit yet, but will soon...
(RAII and smart pointers, hurrah!)
If you wanted to make a difference, you could insist on some of the better process improvement initiatives - CMMI, Test-driven development, effective Code and Documentation review techniques...
The Rotodyne failed because it had several rather impressive disadvantages. It even got debated on ARRSE... the comments from an aero engineer were twofold:
http://www.bbc.co.uk/news/magazine-35521040
The politics angle is a bit of a red herring IMHO. It was the excuse used by the idiots who tried, failed and continued to try in the face of all evidence and the laws of physics to produce quiet tipjets. Everyone else looked at something that could stop conversation two miles away and decided not to buy it. I doubt even with today's design tools and materials you could do it. Even if you could, the fact that you're putting high pressure air, fuel and ignition power up through a rotating rotor hub where a single failure causes a catastrophic loss would make the safety case distinctly challenging.
and...
No, the manufacturers said they could quieten the tip jets down if only they had more money and time. What they failed to do at any time was demonstrate any noise reduction to anyone. Now, either it's a conspiracy or maybe, just maybe horribly difficult engineering problems are just that.
Sorry. Didn't parse your sentence the way you intended.
Didn't the Iranians model their state on the UK? The head of state is also head of the state religion, etc.
In our history, the AK-47 functioned as "the great equalizer" by making it expensive in terms of both money and soldiers to hold the European Empires together. There's a reason European countries lost most of the decolonization wars after WWII.
Be careful of that meme of "firearm as equalizer, allowing patriots to stand up against tyranny" - very powerful, very "Second Amendment", and much beloved of the US right.
You might equally argue that the decolonization happened because the emerging middle-class of locals withdrew their consent. Political problems don't typically have military solutions; Britain basically said "help us out in the Second Big Mistake, and we'll give you independence"; then spent a decade or two trying to withdraw. Empire became Commonwealth fairly quickly, all things considered.
You could argue that the decolonization wars (Rhodesia, Congo, Vietnam, Borneo - as opposed to insurgencies such as Kenya and Malaya) weren't. They were proxy wars between superpowers or regional powers. Wars are hideously, unbelievably expensive - no large-state-funded levels of cash injection means no ammunition, means no (or a lost) war.
Missing, presumed dead. (This happens to quite a few minor name-check characters, and some less minor ones.)
I assume this applies to the Wu worldwalking family as well. Or will they reappear?
Exactly. Agreed completely.
The big tech win in CPU design was ever-smarter branch prediction
Alternatively, you could argue that it was down to ever-greater integration.
Look back forty or fifty years, to what we used to call a "CPU" - it was effectively the ALU, a program counter, and some addressing stuff; all in a simple Von Neumann architecture. It might not even all be on the same chip. As time went on, and with it the ability to generate larger and more complex designs, more and more of the circuitry was moved onto the same piece of silicon.
The architectural techniques that were covered in the "supercomputer design" module of my CS degree back in the mid-80s, are now regarded as entirely normal activities for a consumer-grade CPU; someone even emulated a binary compatible Cray-1 on a $150 FPGA development board...
http://www.chrisfenton.com/homebrew-cray-1a/
"I guess there's a balance point where you have enough but not too much... But saying "this is what you should do" is less likely to be right, and discourages initiative."
A balance point. Exactly. Ideally, we're not trying to make a voyage to the left or right, but to that balance point.
I actually met a MUMPS programmer once. Scary...
I was thinking that if you take the idea of a network chip that does a local, randomized version of pinouts and voltages, that if it detects an attack, it can decide to charge up a capacitor and clobber the other device. Once you've got full control of your network's voltages, "bad, black ice" isn't terribly difficult, though you'd only authorize it under wartime conditions.
I was thinking more about Algeria and Lebanon (the latter more of an edge case as a decolonization war). There were political reasons for those wars ending, but I doubt they would have ended the way they did if the casualties hadn't been so high. It's rare to find a decolonization war without foreign influences, but I would argue that that predates WWII as well.
"Be careful of that meme of "firearm as equalizer, allowing patriots to stand up against tyranny" - very powerful, very "Second Amendment", and much beloved of the US right."
It's irrelevant whether or not the US right loves this meme, the question is: is it accurate? Actually, I also agree that the emerging middle class was very important. I would argue that population growth also played a role in it as medical advances percolated from the metropole.
However, the Dutch lost Indonesia. The French lost Vietnam and Algeria (the latter with a VERY vocal settler population). I'm not saying that an AK-47 guarantees that the rebels win (it hasn't helped the Kurds in Turkey), but it does make such wars more costly in terms of casualties for the more advanced power.
What Adam Smith actually said ....
Most people who play the Smith card haven't read him. What he says about the role occupied by modern investment bankers and CEOs is distinctly uncomplimentary.
That's an interesting perspective, as I've spent most of my career in embedded systems (defence avionics, telecom, and consumer).
Sorry, should have said "manual garbage collection and (etc) are a bad idea in user level applications". Obviously you can't avoid it at a certain point — RTOS, kernels, back-end stuff. But it shouldn't be something that's relevant at the application, as opposed to system, level.
I was thinking more about Algeria and Lebanon
The Lebanese civil war wasn't really a decolonization war in the classic sense insofar as the French (colonial power) granted independence in 1943 and pulled all their troops out by 1946 — the war didn't start for nearly a third of a century (although there was a close shave in 1958). Unless you classify the Maronite christian population as "colonists", of course. As the Maronite church goes back to the 5th-7th centuries AD and the christian population of Lebanon was established way back, when it was part of the Ottoman empire, that's a bit of a reach.
Algeria I'll grant you: that was totally a decolonization war. But the AK-47 was strictly irrelevant: what did it was the deployment of classic Maoist insurgency doctrine, along with bombings, assassinations, and massacres of civilians. When you've got hundreds of thousands of men under arms, the choice of rifle doesn't make much difference on its own.
The "tech wins" for modern CPUs aren't based primarily on silicon fab technologies; they're based on squeezing more performance out of existing designs by adding extra stuff. A CPU that can do out-of-order processing is faster at tasks than one that is purely linear, and branch prediction makes the optimisation of code a lot simpler and it executes faster.
The bad news for someone bootstrapping silicon production from cold, making six-inch wafers in visible-wavelength processes at 1000nm or worse, is that the go-faster tricks like out-of-order execution require a lot of transistors, and they're core parts of the ALU; they can't be fobbed off onto another chip 20 ns or more away on the PCB. Caching common data and instructions helps a lot, but each bit of cache requires more than one transistor, what with flags, encoding etc. Wafer-scale might work, but it never really did in our world, possibly because we got down to 20nm and finFETs faster than the WSI people could get stuff out of the lab.
Even better, keep the orbital nuclear battle-station in a third timeline. I'm guessing that submarines can't move between worlds due to contact with the water, but orbital and aerial deterrents would work nicely. I wouldn't be surprised if the US is keeping a nuclear bomber force in other timelines for that reason. Though the US, at least, is hampered in setting up orbital battle stations due to other space capable powers in their world getting nervous about that sort of thing.
Just passing by ... you are quoted here: https://www.centauri-dreams.org/?p=38591
Sorry, I should have been clearer. I was referring to the war between Lebanon and Israel. I didn't even know about the war you mentioned. Thanks for the link.
And the reason I'm adding that war to the list is because I'm not sure whether Israel intended to annex parts of Lebanon similar to the way they annexed Gaza, the West Bank, and the Golan Heights, or would have changed their minds and done so had the war proceeded more smoothly? I'm afraid I am not well-versed on that war. That's why I called it a stretch.
Computer wars....
Algeria I'll grant you: that was totally a decolonization war
...with support to the FLN provided by Egypt and Nasser, involving equipment, training, and safe havens. Why do you think the French got involved in the Suez Crisis?
https://en.wikipedia.org/wiki/Algeria%E2%80%93Egypt_relations#Algerian_revolutionary_movement
I want C. Any language that does not allow you to get into trouble also doesn't allow you to do complex, good things easily. At that point, it comes down to the experience and competency of the programmer.
Great, all you need is an endless supply of perfectly competent programmers who never ever make a mistake, and C will fit the bill for any programming task. Over here in the real world, where the sky is not pink and filled with sparkly flying unicorns, programming in C is like chainsawing in the nude while blindfolded and drunk, only more dangerous.
C is Assembler on steroids. It has its use cases close to the hardware, bit-bashing registers and the like. It has no place anywhere else, especially where networking and security are involved. GUI? Nope. Database work? Nope. Anything with user-supplied data in buffers? Nope nope nope. Pointers? Absofuckinglutely nope nope nope.
Data abstractions are your friend. C is not your friend.
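Since the thread keeps circling back to user-supplied data in buffers, here is a minimal sketch of the failure mode (the struct and function names are mine, purely illustrative): an unchecked `strcpy` of over-long input silently overruns a fixed-size field, while a bounded copy truncates and NUL-terminates.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical record with a fixed-size name field. */
struct user {
    char name[8];
    int  admin;   /* sits right after the buffer in memory */
};

/* Unsafe: strcpy() writes past name[] when the input is too long,
 * clobbering whatever happens to follow it. */
void set_name_unsafe(struct user *u, const char *input) {
    strcpy(u->name, input);
}

/* Safer: truncate to the buffer size and always NUL-terminate. */
void set_name_bounded(struct user *u, const char *input) {
    snprintf(u->name, sizeof u->name, "%s", input);
}
```

With a 17-character input, `set_name_bounded` leaves a 7-character, NUL-terminated name and never touches `admin`; feeding the same input to the unsafe version is undefined behaviour.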
Charlie, a question if I may: how much of the NAC's government form was driven by meta-concerns?
That is, it seems like you needed the same group of people to be overtly in charge, that is, wielding direct and powerful centralized authority such that they can order armies to march, factories to be built, dissidents purged, etc. for multi-decade spans.
This is difficult in democracies; the average tenure of a British Prime Minister is something like a decade, but that includes some crazy 18th century outliers; the median is closer to three years. Other parliamentary democracies do even "worse" from a perspective of longevity; I'm looking in your direction, Japan.
It would seem like an American-style strong executive system might do here; being directly elected, theoretically long tenures are possible. However, American Presidents have served four to eight years, with the exception of FDR, who managed just over twelve years during a period of massive national crisis. And from a narrative standpoint, this sort of system might have negative connotations for how you want to present the NAC; an elected strongman who has ruled for two decades smacks of South American caudillos, of Saddam Hussein, of Vladimir Putin.
American legislative leaders get even briefer tenures than our presidents; while we have a long tradition of people serving in Congress for decades at a time, the median tenure of a Speaker of the House is about four years. (The average is somewhat higher because of the incredible outlier that is Sam Rayburn, who was Speaker for 17 years.)
All of this, of course, is vastly ill-suited from a perspective of wanting Miriam and associated members of her revolutionary cabal to continue to wield direct and overwhelming political power over a very long timeline, which you need to happen in order to make the plot go, yes? It isn't of much use to you if Burroughs has pulled a George Washington and Miriam is a private businesswoman or one legislative leader among many; you could probably write a good story involving that scenario but it likely isn't the one you wanted to tell.
So you adopt a government form that presents the trappings of democracy in order to maintain the needed ideological bona fides of the NAC for the purposes of your narrative, but also has an enormous strong authoritarian component, with a Supreme Leader who never has to stand for re-election and has immense personal power, thus allowing you to write a narrative where the nation-state in question responds to the wishes of your characters in a timely and efficient manner without an insurmountable amount of baggage.
Or at least, this is how I've reasoned it out. I could be wrong! I'm wrong a lot.
''Sorry, should have said "manual garbage collection and (etc) are a bad idea in user level applications". Obviously you can't avoid it at a certain point — RTOS, kernels, back-end stuff. But it shouldn't be something that's relevant at the application, as opposed to system, level.''
Sorry, but not really, though I agree with the principle of your point.
There are a lot of user-level applications where the current fully-automatic garbage collectors cause serious problems - e.g. games (yes, ZDOOM under Wine, I am thinking of you), or anything in the high-performance area. Note that I am talking about factors of 3-10 slowdown, sometimes more. You can resolve some of the problems by adding hacks that disable garbage collection on certain objects or in certain locations (the answer to Martin's memory-mapped issue), but not all. A better solution would be a much better design of memory management, where the programmer can select the category of garbage collector and where it is better integrated with the hardware; Algol 68, Ada and Fortran all allow that, for a secondary stack. So it wouldn't be either entirely manual or entirely automatic.
And, while fully manual memory management is appropriate for the very lowest level stuff, most of the better embedded systems and kernels have at least semi-automatic memory management. Inter alia, some of those systems are the size and complexity of a large application, and have exactly the same problems with debugging, security and RAS. So a better solution would be exactly the same approach, but with a different set of selectable categories.
The reason that we can't have that? It's the languages. C++ is particularly evil in this respect, because it (often unpredictably) generates so many calls to the memory system in unclean contexts, and ways for the program and garbage collector to become at cross-purposes. C++11 safely-derived pointers were an attempt to resolve the latter, but there are still problems; I had an interesting discussion with Hans Boehm about this. And the unclean context is a real nightmare if you want to write high-RAS or high-performance code, especially if portability matters, because the programmer needs to know what the compiler might do behind the scenes, and defend against it, which needs skills that are rare indeed.
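The middle ground being described here, neither fully manual nor fully automatic, is roughly what an arena (region) allocator gives you: individual allocations are trivial bump-pointer operations, and the whole region is released in one call. A minimal sketch, with names of my own invention; a real arena would also handle alignment and growth.

```c
#include <stdlib.h>

/* Toy arena: bump-pointer allocation, freed all at once. */
struct arena {
    char  *base;
    size_t used, cap;
};

int arena_init(struct arena *a, size_t cap) {
    a->base = malloc(cap);
    a->used = 0;
    a->cap  = cap;
    return a->base != NULL;
}

void *arena_alloc(struct arena *a, size_t n) {
    if (a->used + n > a->cap)
        return NULL;              /* out of space */
    void *p = a->base + a->used;
    a->used += n;
    return p;
}

/* One call releases every allocation made from the arena:
 * no per-object free(), and nothing for a collector to trace. */
void arena_free(struct arena *a) {
    free(a->base);
    a->base = NULL;
    a->used = a->cap = 0;
}
```

The programmer still decides object lifetimes, but only at the granularity of whole regions, which is the "selectable category" idea in miniature.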
Pointers? Absofuckinglutely nope nope nope. Data abstractions are your friend. C is not your friend.
That's rather a blanket assertion, and hopefully you're limiting your tirade specifically to C - because there are some pretty successful GUI and networking libraries and tools out there written in C++ (not to mention quite a few written in C).
Anyway, programming is difficult. Pretending that "Language X will prevent below-average or sleep-deprived engineers from making mistakes" is foolish; and sometimes you need that power in order to achieve the task. Claiming that C-style raw pointers are considered harmful just suggests that you have poor discipline when it comes to your design... There's plenty of rubbish and buggy Java, Haskell, and Ada source code out there to make my point for me.
I may be biased, but then I remember changing from programming in Assembler, to programming in C. Then, after five or six years, on to C++ (granted, I was effectively writing C and using a C++ compiler...). I even went back to Assembler for a year, while doing some time-critical low-level networking stuff. There is a big difference between C and Assembler, I can assure you.
Think of C as "a programming language for grownups" ;)
Claiming that C-style raw pointers are considered harmful just suggests that you have poor discipline when it comes to your design...
Discipline is great if you can enforce it 100% of the time on 100% of your programming staff and be certain they never ever make a coding mistake which compiles but only triggers every second Tuesday. Me, I'd rather use an HLL that's got no pointers at all to tempt the unwary, the reckless, the clumsy or the hungover. Sure, they can produce some rockstar code -- I've got some code I wrote a while back that uses auto-incrementing indexed pointers in a really neat way, but it's horrible code, obfuscated Assembler -- but the programming industry is getting the idea that rockstar coders are bad for business, because rockstar code is often problematic in the long run.
Sure you can write bad code in any language (Quantum Intercal, anyone?) but languages should not make it easy to screw up. Data abstractions are your friends.
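For what it's worth, the "auto-incrementing pointers" style being argued about looks roughly like this (my own sketch, not anyone's actual code): both functions copy n bytes, but the second is the one you want to meet while hungover.

```c
#include <stddef.h>

/* "Rockstar" style: auto-incrementing pointers packed into one expression. */
void copy_terse(char *dst, const char *src, size_t n) {
    while (n--)
        *dst++ = *src++;
}

/* Plain indexed style: identical behaviour, easier to audit. */
void copy_plain(char *dst, const char *src, size_t n) {
    for (size_t i = 0; i < n; i++)
        dst[i] = src[i];
}
```

Any decent compiler generates much the same machine code for both, so the terse version buys nothing but review-time friction.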
Wow. You got this completely backwards. The first "fast" computer was the CDC 6000 series: definitely a RISC-like design, built two decades before RISC was "invented". It ticked Watson Jr. off no end; its performance just made the 360 look silly.
The 50 cent CPUs were microcontrollers, not general purpose CPUs. Different rules apply for cost/performance tradeoff.
If I had the chance to do the microprocessor revolution over again, I wouldn't start with the 6502, which you pretty much had to program in assembly language. You want to start with a simple but relatively powerful architecture that takes HLLs into consideration. I would submit something like the National Semi 8900 (aka PACE), which was modeled on the Data General Nova. It came out within a year or two of the 6502, and its ISA is friendly to HLLs. Further, since it was 16-bit to start with, making faster versions when more transistors became available (through pipelining, etc.) was possible. The 6502 is an architectural dead-end.
And if you are going to try and avoid T2 mistakes, go right to a 64 bit arch, skip 32 bits. The R4000 came out in '91, I suspect it could have come out earlier.
All IMO of course...
thus allowing you to write a narrative where the nation-state in question responds to the wishes of your characters
Nope. The real issue here is that the Commonwealth is a post-revolutionary empire. Your precedents are France after the revolution/terror, or Russia, or Iran (hint: Persian empire). In all these cases, after an initial period of turbulence/civil war/terror an authoritarian central power emerged. And a common factor was a strong external enemy/threat. (France: Britain. Iran: the USA and Saddam's Iraq. USSR: everybody jumped on them with fists right after the revolution.)
Again, there were some democratic aspects to the early USSR (between civil war and terror, before Stalin got his boots under the table) and Iran features a multi-party system. France ... got Napoleon, who was sui generis, but arguably an early prototype for the 20th century dictators who emerged in post-monarchical power vacuums.
Anyway: the Commonwealth forms in a crisis and is confronted by international (and internal) threats. As an empire I don't see any way it isn't going to gravitate towards an authoritarian core; if anything, Miriam's provision of Cliff's Notes On Revolutions In Time Line Two And Their Failure Modes is a major push towards bedding in democratic values which might otherwise have been submerged by a counter-revolution/terror; but after 15 years, it's wearing thin — and that's roughly when our story starts.
(As for why Miriam is still involved? World-walking. The Commonwealth doesn't have anything like the biotech base needed to build ARMBAND from tissue cultures, so they're dependent on the Clan survivors, and they need paratime transportation because they know the USA is going to come calling sooner or later.)
The CDC 6000 series was designed -- and many units hand-built -- by Seymour Cray. The Cray-1 instruction set was pretty much identical, but with vector register instructions added. The Cybers used secondary, external processors to handle I/O; these were barrel processors (one "core", but 10 sets of registers, cycling through them every clock cycle).
If you parse RISC to mean "Reduced Instruction Set Complexity," you pretty much start with the Cybers. (The failing there was that writing to registers A1-A5 would result in a read from the corresponding memory location going to the corresponding X register; A6 and A7 were the same, but wrote to memory. However, instruction parsing was trivial, and all the instructions were implemented in discrete transistors.)
(Fun fact: the CDC 6600 had a foot-long wire that connected two points only about an inch apart. At least one CDC tech broke customer equipment by replacing that wire with a shorter one. See, context switching on the Cybers was done by an eXchange Jump instruction, which just swapped register contents with a memory location. And Seymour, in an effort to cut down on transistor count, used the speed of the wire to avoid temporary space -- it did a read from memory, and then immediately wrote to the memory, using the long wire to act as a buffer. Replace it with a shorter wire, and everything broke. Much hilarity.)
Sorry to rant about one of my favourite computers.
I'm really considering desktop CPU systems, a few thousand bucks in 1970-era money. For those systems a RISC CPU would need more program RAM, since it takes more individual instructions read from RAM and executed to carry out the same operations as on a CISC CPU. More RAM means more expense, and the speed advantage of simpler instructions could well be eaten up by the extra read cycles, unless there's an on-CPU instruction cache, which requires lots of transistors that the silicon fabs of the time couldn't quite manage.
Big Iron systems would cost millions of 1960s dollars, and with cost no object, RISC-type instruction sets could have an advantage, although the reads might carry performance penalties. According to Wikipedia, the CDC 6000 series had an instruction cache, big enough to hold a complete loop if the loop was small enough. I don't know which was the first integrated CPU with an instruction cache; it might have been something like the Cyrix variants of the 386 chip.
Yeah. The best description I saw of C was "a teenage hacker's wet dream". I am teaching software design, starting tomorrow, and will tell the kiddies (i.e. graduates) that their worst enemy when programming is the idiot at the keyboard. 15 years activity on SC22WG14, and other relevant experience, did not make me think any more of C as a suitable language. Yes, I can write C so that it is near-bulletproof and ports (code unchanged, where the standard permits) 25 years later to systems that weren't invented when I wrote it, but damn few people can.
''If you parse RISC to mean "Reduced Instruction Set Complexity," you pretty much start with the Cybers.''
FAR too late! What people now forget is that ALL computers of the 1950s and almost all of the early 1960s were far less complex than ones of today; remember discrete components (or even discrete logic)? Yes, Seymour Cray was influential, but RISC goes back to the very beginning of the modern era. The ICL 1900 range (1964) was designed for reduced complexity, for example, and the initial IBM System/360s were simpler than most RISCs of the RISC era.
I haven't read the book(s) as yet, but if you were going to start the computer revolution, you wouldn't start with desktops; you really do need to build discrete-component systems first. You need the compute power in order to run the calculations to build high-tech stuff. (Fun fact: Michigan State's first cyclotron was designed using a CDC 6500; later, the superconducting cyclotron (not the big one, the one on the MSU campus) was designed using the CDC 170/750.) The CDC 6400 was a lot smaller and about a third as fast, which allows all sorts of calculations.
As far as a small, discrete-component computer goes, the PDP-8 is usually the benchmark (similar to the CDC's peripheral I/O processors in some ways).
And as I said in an earlier post, for desktops I would start with something like the Nat. Semi. 8900. Not RISC, but not CISC either.
C has its problems (a bad programmer can write bad code in any language...)
But the language that sends chills down my spine these days is JavaScript. What a nightmare.
My unfavourite one was TeX, until I did some complicated C++ template programming. In neither case can you introduce any diagnostics or checking without changing the result. I have just spent some time trying, and failing, to get the LaTeX verbatim environment to use a user-defined colour with the new colour system; I can do it with a built-in colour, and I could do it with the old system, but the new one has defeated me. Before anyone mentions fancyvrb, I have already got other hacks to cover up other deficiencies of verbatim and don't want to start all over. But attempting to write a decent (i.e. up to the standards of Algol 68 / Fortran / Ada) matrix class in C++ templates has defeated me, Boost and Bjarne Stroustrup.
My point is that there is a huge amount of software that is like that, including most windowing systems.
if you were going to start the computer revolution, you wouldn't start with desktops
The first real usable desktop was probably the Apple II, in my opinion, and the first real usable program which made the Apple II a success was Visicalc. Businessmen would go into computer stores and say "Gimme a Visicalc!" Mainframes could not really do "what if" spreadsheet operations as batch jobs, and they were not really amenable to running them on multitasking OSes, not without some graphical capability; and the terminals for that were either not fit for purpose (Tek 4014) or horrendously expensive (Whirlwind). An Apple II, a couple of floppy disc drives, a monitor and a copy of Visicalc was two thousand bucks, a business expense almost in the petty cash region.
Word processing is another killer app for desktops[1] but it was the spreadsheet that pried computers out of airconditioned suites and put them onto desks.
[1]I earned a bit of beer money using TROFF on the University VAXes writing up theses for Ph.D students who were being billed 50 pence a page by the departmental secretaries. The fact I could correct stuff, produce proofs etc. on free lineprinter paper before sending the output to the expensive daisywheel printer for them was a bonus. There was, however, no spreadsheet program for the VAXes as far as I could determine.
Charlie, would it make sense for the Commonwealth's computer scientists to have read, and profited decisively from, John Backus's lecture "Can Programming be Liberated from the von Neumann Style?"?
(Best link I could find was this PDF, sorry: http://www.cs.cmu.edu/~crary/819-f09/Backus78.pdf.)
The first several pages pretty much nail the gist. Backus knew how to write a technical article.
The key point is the lost opportunity embodied in the unbroken inheritance of inefficient code style from the old times. This goes back to bloated information topologies baked into computer language design -- at an extremely fundamental level -- as a response to the high cost of hardware available during the wartime efforts of the 1940s. The Commonwealth would be faced with a different situation, I think?
Backus's critique of the fundamental style of our timeline's computation might serve as a useful plot point in this story of crosstime competition.
I suspect that the reason isn't the actual weapons so much as the differing attitudes towards casualties (and the resources required to maintain a fighter). Remember your Kipling:
With home-bred hordes the hillsides teem.
The troopships bring us one by one,
At vast expense of time and steam,
To slay Afridis where they run.
The "captives of our bow and spear"
Are cheap, alas! as we are dear.
http://www.kiplingsociety.co.uk/poems_arith.htm
Me, I'd rather use an HLL that's not got pointers at all to tempt the unwary, the reckless, the clumsy or the hungover.
So... how do you plan to achieve indirection?
As a more concrete example, consider a modifiable data structure, say... a list? Insert something half-way-through?
What if you have a program handling structures of large objects, or large structures of small objects - are you going to embed all necessary structures into the language and hide any concept of "pointing", or are you going to accept large amounts of shuffling around, and write off the inefficiencies to Moore's Law?
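To make the indirection point concrete: inserting half-way through a linked list is a constant-time pointer splice, which is hard to express in a language with no notion of "pointing" at all. A minimal sketch (the node type and function name are mine):

```c
#include <stdlib.h>

/* Minimal singly linked list node. */
struct node {
    int value;
    struct node *next;
};

/* Insert a new value immediately after pos: two pointer writes,
 * no shuffling of the rest of the structure. */
struct node *insert_after(struct node *pos, int value) {
    struct node *n = malloc(sizeof *n);
    if (n == NULL)
        return NULL;
    n->value = value;
    n->next = pos->next;   /* new node adopts the old successor */
    pos->next = n;         /* predecessor now points at the new node */
    return n;
}
```

A pointerless language has to either build the list type in (hiding the indirection) or copy everything after the insertion point.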
Personally, I'll stick with the first option in Hoare's perspective: a function is either so simple that there's obviously nothing wrong, or complex enough that there's nothing obviously wrong.
That, and I do love Boost and the STL. Having had to write my own data structures back in the days of C (embedded stuff, so they had to be utterly bulletproof), I'm a firm believer that it's arrogance on a stick to insist that you can do much better than the Boost/STL teams, and with a lower defect density.
Screw the "rockstar code" stuff - I want to win simplicity awards, not "make people gasp in awe at the code complexity". I want it to do just what it says on the tin, without needing patched, until the requirements change. My ideal is for someone to try and maintain what I did, and say "well, that makes sense - this looks easy!" or "OK, so that's obviously what he's trying to do here" because I've commented or documented it to a sensible level.
Basically, hire good people and train them properly. Pick the ones who want to keep learning, not stagnate; and who want to be better today than yesterday. Give them time to think and design, rather than insist that they just get coding. Give them some time to rework and harden, rather than insist on a constant sprint. Give them the tools they need to do the job (build / test / debug tools and infrastructure, not just "here's Eclipse CDT, on you go"). Insist on code reviews and objective quality measurements. Put them in properly-resourced and balanced teams.
Re: '... introduce any diagnostics or checking without changing the result.'
Have wondered why checking SW as it's being written never became standard. I get that the entire OS can be enormous but since most apps are modular (do/connect to only one or a handful of specific things), testing each module before it's added to the OS seems reasonable/prudent.
From personal experience, another screw-up source to consider is consumers who accidentally find a way into the OS and unknowingly bugger everything up, i.e., the leave-all-doors-open-and-see-what-happens school of computer SW architecture.
tuberculosis
Is there a place that summarizes all of this somewhere? Looking for a reference, not an argument.
Someone else pointed out that the absence of central authority gives you Somalia. Good point. So I will modify my position: historically, over-controlling has made things worse.
Worse than Somalia? Not much of that on the planet. They are in the bottom 10% of places to be on earth. IMO.
You're right that the USA stepped up central authority in WW2. But the point I was making is that Germany, which had more central authority over development and production, did much worse.
Germany pretended it didn't have to switch its consumer industry to a wartime footing until very late in the game. The US switched as fast as the Feds printed the money. Interestingly, I suspect the population of Germany would have been more in favor of an early switch than that of the US.
There's an essay called "Losing the War" which touches on some of this: http://www.leesandlin.com/articles/LosingTheWar.htm A main point was that the "bad" guys in WWII didn't know how to give up even after they knew it was hopeless. But there are a lot of interesting observations throughout the somewhat long essay.
I decided about 30 years ago that most of what are called programmers are just coders. Only about 10% actually "program", i.e. write code that does something in a reasonably optimal way.
What you did with all of those embedded systems required the 10%. Or maybe the 1%. Much of what I did back in the day was the same way. Bringing in "good programmers" from a major corp was almost always a disaster. Their mainframe or similar experience had warped them into a "who worries about memory or resources" mindset that most could never get over.
But but but but
Most of what the 10% of us did filtered back up to be used by the 90%, without appreciation of how bad an idea it was to code a major payroll application for 100K people the same way you wrote code for an embedded system controlling a jet engine.
But since the demand was there and the 10% could not come close to filling it, we wound up with the bottom 30% implementing payroll and such without really an appreciation for what they were doing, especially in thinking about how things they were doing in the moment would affect things 1 or 5 or 10 years down the road.
but damn few people can.
And only a subset of those actually do.
I wonder how much of current production code is really an experiment that was never cleaned up.
Hmm. I hadn't realized that Kipling wrote about the US Vietnam war.
Basically, hire good people and train them properly. Pick the ones who want to keep learning, not stagnate; and who want to be better today than yesterday. Give them time to think and design, rather than insist that they just get coding. Give them some time to rework and harden, rather than insist on a constant sprint. ....
Great concept.
The problem is that in almost every case you will be taking too long (define that as you wish, but you're never in charge of all the timelines) and you'll have a very hard time finding these people, mainly because a lot of other people want them and they command astronomical pay in the minds of HR. And then half to nine-tenths of the ones you wind up with, after you've brought them "up from the minor leagues", will leave in a year or four to double their pay over the obscene amount you talked HR into paying them to stay.
My unfavourite one was TeX, until I did some complicated C++ template programming. In neither case, can you introduce any diagnostics or checking without changing the result.
I haven't done anything in TeX (only really made one LaTeX template in addition to using it a lot), but the C++ templates also complicate testing.
I once worked in a large project which was done mainly in C++ (and Qt, which constrained the C++ to a nice limit). Our team was responsible for an application on the system, and we used rigorous unit testing. We had a lot of dependencies to other teams' modules, and for unit testing we mostly wrote simple mock-up versions - it wasn't smart to unit test their code in addition to ours, so the mock-ups just returned easy answers.
Except for this one library, which was originally written in C but a one-person team was assigned to write a C++ interface for that. The interface, while simple in principle, used almost everything you could use in C++, starting from templates and ending with everything happening in the constructor of the class. I spent a couple of days trying to write the mock-up version, then asked a neighbouring team how they handled it and decided to do the same thing: just use the library as-is, as they had tried the same as I had and given up.
In the end we skipped the C++ wrapping and used the C library directly, if my memory is correct.
''The Commonwealth doesn't have anything like the biotech base needed to build ARMBAND from tissue cultures, so they're dependent on the Clan survivors, and they need paratime transportation because they know the USA is going to come calling sooner or later.''
Which means that the Commonwealth need to capture someone using "Armband" & then reverse-engineer the product for themselves.
And at the other extreme is a state with a totally-controlling central authority, with results just as bad, possibly worse than Somalia's: the "DPRK". Which suggests, as is almost always the case in politics, that the "correct" place to be is somewhere in the middle .....
Doesn't it show that the correct place to be is at the top? :)
Except for this one library, which was originally written in C but a one-person team was assigned to write a C++ interface for that. The interface, while simple in principle, used almost everything you could use in C++, starting from templates and ending with everything happening in the constructor of the class.
That's not a flaw of C++, that's a failure of oversight by the rest of the team / team lead. You don't wait until delivery to review the design, and it's not a healthy team if everyone spends their time in a silo...
Anyway, I'm a cynic. At a guess, that one-person team was trying to build up their CV / resume for another job? And reckoned that they could get some hours of practice done on Company time?
Don't knock "how will this look on my CV" as a major factor in the selection of the technology to be used on a new project, particularly among the (shall we say) "more ambitious and self-interested" fraction of the population.
You don't have to be particularly ambitious to take that sort of thing into account. I recently turned down a very interesting looking job largely because the tech they used would lead to CV rot and leave me unemployable* within a few years.
No marketable skills = no leverage, and I do not want to end up trapped.
*in my preferred area anyway.
I agree that the whole thing wasn't about C++, and more managerial control would have helped. The templates were the technical thing which created the most problems there, so I kind of blame them, too.
They are useful sometimes, yes. I think what I would like in a software project is to make testing easy and commonplace, from unit testing up to testing the whole system.
NO - or certainly not at the extremes, where being at the top makes you a real live target.
The trick is to get as much of the country's wealth into your Swiss bank account as possible and jump before the revolution.
Guess who used to play too much "dictator" :)
...until the neglected foundations collapse under you.
Uh, by keeping a pointer to it? Garbage collection is not swapping, it's about making sure unused memory is reusable.
What is this "secondary stack" of which you speak?
((*) for versions of Fortran <= 77. I know nothing of Fortran >= 90).
Should, of course, read
((*) for versions of Fortran <= 77, I know nothing of Fortran >= 90).
Fucking HTML.
Mainframes could not really do "what if" spreadsheet operations as batch jobs and they were not really amenable to run on multitasking OSes,
Er, no. Mainframes were quite capable of running spreadsheets, but a combination of politics & economics actively discouraged it.
Politics: Lots of programmers were employed by IT to develop programs for other departments. Putting out a generic spreadsheet program that anybody could use really undermined job security.
Economics: Mainframe time was expensive. Yes, micros could do the job cheaper.
Of course, having non-programmers generating spreadsheets created all sorts of bad answers.
The ARMBAND devices aren't really something to reverse engineer. It's a bit of cultured Clan-brain goo and a fancy chip to turn it on. The chip isn't the hard part. The hard part is culturing the brain-goo that makes the whole thing go.
"Complete absence of AK47's in all three examples."
It's not really about AK47s. It's about what happens when the locals pick up enough education and technology that it becomes expensive for the colonists to keep them in line. This happens about the time the locals have decent high-schools and the ability to make/pay arms dealers for semi-automatic rifles and explosives.
The AK47 is merely a good pointer to the fact that the colonists don't have enough military advantage to dominate the locals anymore. If anything, it's symbolic.
In Greek terms, the tragedy of Empire Games and Dark State is hubris. The Americans in the books are very typical of my countrymen. They imagine that they are substantially better than their opponents, but the reality is that if they're lucky they're merely running into another Vietnam. If they're not lucky... they get to be the bad guys in a rerun of the final battle from Niven and Pournelle's Footfall, and if I understand OGH's hints above, the Commonwealth is building more than one Orion spacecraft.
In the subsequent inquiry, Colonel Smith will attempt to invoke the "Fithp Amendment," but nobody in dark state U.S. will get the joke.
I should have written "...don't have enough military advantage to dominate the locals cheaply anymore."
Nojay, I expect better of you than this strawman rant.
"I want C" != "all software should be written in C, and everyone should write C", and you know that perfectly well.
High level languages have their place (says the guy who loves awk)... but you do not write an o/s, or drivers, or service-level code in them.
If I were to turn your argument around, I'd offer an entire o/s written in java.
Two more things: programming style is not taught, though perhaps some of it could be. For example, I had to learn on the job, but as I used to say when I was jobhunting: I write code so that when I get a call at 16:15 on a Friday, or at 02:00, about a problem that needs fixing, neither I nor anyone covering for me should have to spend hours trying to figure out how I'd been so "clever". I want to leave work on time, or go back to bed quickly after fixing it.
Peer review was supposed to do that. Now, I assume, it's teams... and far too much of the time, that doesn't happen, you just get "looks good to me".
The other is management. It's always management. "Sure, I want you to do good work. Now, you've got 13 hours to write this system, or enhance it, and it needs to go into production tomorrow morning. Do whatever it takes...." And yes, I've been there, and nearly achieved clinical burnout, according to a friend who's a practicing degreed psychologist (if Kelly Higgins is reading: hi! See you at Windycon, I hope).
You wrote: A main point was that the "bad" guys in WWII didn't know how to give up even after they knew it was hopeless.
Most of the time, it seems, they never do, unless they're forced, like Nixon, or like Trumpolini will be.
Note, for non-USans: no more than two terms was tradition. After FDR's death, the GOP rammed through a Constitutional Amendment making it law.
Oh. Ghu. The Cyber 6000.
In fall '85, at Temple U, I had an o/s course and was on a PDP-11 running RSTS (I think) — a time-sharing system, and it was lovely to work on. In the spring of '86, I took a compiler design course, and we were on the Cyber 6000, which ran NOS — which I started referring to as the Noxious Operating System. On the PDP: dial in from work, upload my code, compile and go. On the CDC and NOS: log in, go into text mode, upload my code, get out of text mode, save my code, and then fetch it again — because it had no idea it already had it — before I could compile it.
It may have been fast, but it was miserable to work with.
Sorry, simpler solution, lower-cost, faster to produce and get on desks: terminals, connected to a mainframe.
With the advantage that it's a LOT harder to steal data, and just stealing a terminal doesn't get you what stealing a desktop computer would.
Und ve know vhere all ze terminals are!
Grin :-) As I posted, "Algol 58" was just finger trouble - Algol 68 did. And it is now 2017 - Fortran 77 was superseded 27 years ago.
It's interesting how NAC might bootstrap a computing industry; as above, maybe going straight for massive parallelism and functional languages.
But there's a warm-body-pipeline problem to bootstrap, and an enormous set of automation low-hanging-fruit - so in our timeline, we ended up with COBOL, simply because there was a large amount of (nevertheless valuable) simple processing to be done on bulk data. (When you have maybe some dozens of cpu-cycles'-worth of code to run per block of stored data, the bottleneck is the IO, not the algorithmic efficiency {or the coder smarts to implement it} ).
"High level languages have their place (says the guy who loves awk)... but you do not write an o/s, or drivers, or service-level code in them."
Yes, people have done, very successfully. Several of the Algols (including Algol 68, as I can personally witness), and I have been told of others (though I now forget the details).
"Two more things: programming style is not taught, ..."
I am doing so at present.
"Path dependency (and pork-barrel politics) lock us into sub-optimal solutions like the aforementioned Osprey — five gearboxes flying in loose formation — because the cost of reducing the noise level of the Rotodyne — an "unproven" technology — requires veering off the beaten track, and existing market incumbents don't want the competition"
Changing direction is hard.
A friend was a senior scientist at Xerox PARC in the glory days. Xerox knew they wanted to change, and did some very smart things, but could not overcome their internal organisational inertia.
I like the cigarette lighter in cars as an example. Which I use to run a USB adapter, to charge my flashlight. Evolution leads to local maxima, not to global maxima.
Perhaps you (Charlie) already considered this, but just in case: For seasonal affective disorder (SAD), check for a vitamin D deficiency, or simply include sizeable vitamin D supplements with your diet. 4000 IU/day is the upper end of what is generally recognized as safe. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4673349/
Car cigarette lighters aren't all bad - high current sometimes counts for a lot.
I sometimes use mine to run an inverter that then powers things that would not react well to being run on a USB connector. I need to be careful not to run down the battery though.
I tend to find them slow and awkward to use, and it is accordingly harder to find a suitable slot in time to transfer some concentration from watching the road to using the lighter than it is with an ordinary butane-fuelled lighter. They are also no good for roll-ups, because they rely too much on pressure to achieve heat transfer to work well without the structural integrity of a chemmy to resist the applied force. On the other hand they are rather well suited for the tobaccoless smoking of hash.
Thanks for the note on autogyro hybrids. I did some quick digging and found that the Osprey's range (about 1100 kilometers) is a bit less than the autogyro record range. However, it's a lot less than the record range for a helicopter (2800 km), and that leads to the interesting problem: the USMC currently loves them some Ospreys because of Afghanistan. IIRC, Afghanistan is out of military helicopter range of the Persian Gulf, meaning that if you want to get marines to Afghanistan, you've got to put them in planes, offload them in Kabul or wherever, then put them in helicopters or on the ground to get them to wherever they're needed. That, and the minimum wage the jarheads are collecting, makes them hugely expensive soldiers. Or you can load a couple of squads into an Osprey and fly them right to the firebase.
Since there are stories of stealth helicopters in the US "crown jewels" (secret aircraft used for extreme missions--stealthy birds have been around since at least the Vietnam War), and they supposedly have sound suppressing curved rotor designs, I do wonder if the US secret arsenal includes a stealth rotodyne. It's not impossible, but unlike the stealth helicopters and the flying dorito chips, no one's ever reported one.
"Their mainframe or similar experience had warped them into a "who worries about memory or resources" mindset that most could never get over."
It works the other way round, too... at least, I certainly find I'm so accustomed to thinking in terms of 2k total physical memory that it sticks in my throat to statically allocate a 2k buffer for something even when I've got 32G to play with. And I can't use an interpreted language without constantly fretting over how much elephant grooming is going on behind the scenes when I do some simple thing like concatenating two strings.
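The "elephant grooming" behind an innocent-looking concatenation can be made visible by counting the bytes an implementation has to shuffle. A minimal sketch (the copy counts are an illustrative model of naive string building, not a measurement of any particular interpreter):

```python
# Hidden cost of repeated string concatenation: building a string with +
# copies everything accumulated so far on every step (quadratic total work),
# while an explicit buffer/join does one sizing pass and one copy.

def naive_build(parts):
    """Concatenate with +, tallying bytes copied as a proxy for hidden work."""
    s, copies = "", 0
    for p in parts:
        copies += len(s) + len(p)  # a new string is built from both operands
        s = s + p
    return s, copies

def buffered_build(parts):
    """Join once: one pass to size the buffer, one pass to copy."""
    total = sum(len(p) for p in parts)
    return "".join(parts), 2 * total

parts = ["x" * 10] * 1000
_, naive_copies = naive_build(parts)
_, buffered_copies = buffered_build(parts)
print(naive_copies, buffered_copies)  # 5005000 vs 20000 bytes shuffled
```

Same ten kilobytes of output either way; the naive route models moving about 250 times as many bytes to get there.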
Questions from a non-techie:
What is the minimum number of computer languages that a world that's never had computers before would need in order to create its own computing infrastructure? (Which languages and why. It seems that different computer languages were created for different purposes.)
How does AI fit into this computer-lingo Babel given that AI is widely anticipated to become the dominant computing creation? (I'm assuming that a sufficiently advanced AI could end up creating its own language much like every generation of human kiddies adds new words/concepts to our lexicon.)
Lastly - are there any languages that are AI-proof?
As for electric cars, our Bolt weighs 1624 kg, of which 440 kg is the battery, and it gets probably around 400 kilometers on a charge in average driving. A rough comparison car (pulled out of a random orifice) would get around 50 km/gallon of gasoline, and gas weighs around 2.86 kg/gallon. So the gas to go 400 km in this hypothetical car would weigh around 22.9 kg — roughly a nineteenth of the Bolt's battery weight.
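The back-of-envelope arithmetic above, spelled out (figures are the ones quoted in the comment; the 50 km/gallon comparison car is hypothetical):

```python
# Weight of the energy carrier: Bolt battery pack vs. the petrol a
# comparable car would burn to cover the same distance.
battery_kg = 440.0            # Bolt battery pack
range_km = 400.0              # charge range in average driving
km_per_gallon = 50.0          # assumed economy of the comparison car
petrol_kg_per_gallon = 2.86   # approximate density of gasoline per gallon

gallons_needed = range_km / km_per_gallon           # 8 gallons
petrol_kg = gallons_needed * petrol_kg_per_gallon   # ~22.9 kg
ratio = battery_kg / petrol_kg                      # ~19x heavier

print(f"petrol: {petrol_kg:.1f} kg, battery/petrol weight ratio: {ratio:.1f}")
```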
Now obviously this is a problematic comparison, as the battery can be recharged for around 200,000 km while the gas gets burned. Still, the reason civilization went with petroleum starting 150 years ago or so is that it's ridiculously energy dense. Moreover, when it's gushing out of the ground, it's really cheap to make too. It also lends itself to all sorts of warfighting and other scaling (imagine running a tank running on current generation lithium batteries), where batteries do not (imagine recharging a tank battery in the middle of a blitzkrieg. I suppose you want your BOLO now).
The problem we face now is, as noted, making electric storage as energy dense as gasoline without making it noticeably more dangerous than gasoline. However, we also face the problem of making that electricity storage and the associated infrastructure about as functional as the current oil infrastructure. Basically, civilization hangs in the balance on this one, and we can but hope to innovate fast enough.
Indeed, we may be living through some version of the Fermi Paradox right now: if fossil fuels normally accumulate in Gaian-type biospheres, the first intelligent life form that discovers how to use them for fuel ignites a massive boom for its species, only to find out the trap too late to do anything about it. After the crash, the survivors lack the energy (literally) to get off the surface of their planet or even to make powerful radio signals, thereby not inventing starflight or even telling planets in other systems that they exist. They could exist for millions of years thereafter, and there would be no way (short of getting a really, really good space telescope trained onto their planet or inventing star flight), to know that they even existed.
Still, it's not all grimdark: the Bolt has the equivalent of a 200 hp engine under the hood, and unlike a 200 hp gas engine, it also does hypermiling really well. Electricity is fun to drive, even in the face of a looming crash.
I mean the socket, not the lighter itself. I don't use them to set fire to things; I do use them to power equipment.
Only one language is actually needed, but a reasonable number would be half a dozen, including one 'general purpose', one 'low level' and one scripting (which would have database, Web etc. variants). Plus 3 more for things I haven't thought of :-)
"I'm assuming that a sufficiently advanced AI could end up creating its own language ..."
They don't need to be advanced, and already have.
https://www.bloomberg.com/view/articles/2017-08-03/robots-created-a-language-humans-shouldn-t-panic
"Lastly - are there any languages that are AI-proof?"
I have no idea what you mean.
There's no reason for there to be more than one assembly language, and one high-level language. In a planned computer revolution, you'd probably want to have a couple of different ones, but for the most part, sticking with a variant of Pascal would suffice.
We had so many types of computers, and programming languages, because it was done by multiple groups, competing against each other. Which is not to say that's bad, but it's not necessary if you're looking at what has happened over the last 50 years and starting from scratch -- start with an 8 or 16-bit computer, and then go to 32-bit and 64-bit using an extensible architecture (e.g., ARM).
I'm not sure what you mean by "languages that are AI-proof."
The entire 327x display mode — fill out a form, submit it, wait for a new form — was a big disincentive to the UI that most spreadsheet users seem to want.
I never saw a UI on a 327x style terminal that would work for most uses of a spreadsheet. Maybe someone COULD come up with one but none did that anyone noticed or wanted to use.
And maybe some kind of document formatting language.
NOS's filesystem was a tape metaphor, so you had to do things like rewind a file after writing it. And files could have more than one EOF -- you read until you got EOT.
It was not in any way designed to do interactive character work, which was a side-effect of the PP I/O processors handling all I/O.
For our COBOL course, I had a choice of using a Z80-based, 8-inch floppy CPM machine... or using the Cyber. The CPM machine took, literally, 25 minutes to compile an empty program; the Cyber compiled a couple thousand lines a second.
For the assembly-language course, we were to use a PDP-11 running RSTS/E; it was so overloaded that I ended up writing an emulator on an unused 3B5. It was my first C project.
A fairer comparison would be the weight of the energy source plus the power train of each. I suspect even with a 4 wheel drive electric the drive train weighs a lot less than for a 2 wheel IC drive train.
Yes. Maybe. As long as you don't count assembly as a language. It's more of a people-friendly way of showing hardware op codes.
And then you get to things like SQL.
It's a very fuzzy world when you start deciding what is and is not a programming language.
Not sure I agree with that. Not personally familiar with the National Semi (is that the hiatus in government before we vote for a new one?) PACE, but looking it up it appears to be very much like an 8086 minus the segment:offset mechanism to allow it to address more than 64k. It does not overcome the principal limitation that the 6502 ran into - which was the 64k memory limit, not the 8-bit arithmetic - but it does strongly suggest something like the 8086 as the next step, which I would submit is probably not the direction we really want to be going in. It is also really bloody slow compared to the 6502.
The 6502 was developed in a situation not unlike that of the Commonwealth in some respects, and has the two advantages of being very simple in terms of its internal circuitry so it is easy to fabricate, and being very fast because it isn't microcoded. As for being a "dead end", the 6502 is still around as a microcontroller core.
I probably wouldn't start with an unmodified 6502, but the alterations I would add are pretty minor in terms of complexity and don't alter its basic character. We might even have seen them on the original chip if it hadn't been constrained to 40 pins by its package. Off the top of my head:
32-bit addressing, of course, which is probably the worst one for complexity, but doesn't really involve much more than some extra duplication of existing circuitry. Fill in the holes in the instruction set (some of the later derivatives did begin to do this). Add the few signals (about 4) needed to permit a byte-slice configuration to handle arithmetic with more bits. Provide more bits for the stack pointer and index registers, and make the stack movable instead of fixed. Provide a register to bank-switch "zero page" so it can be anywhere in memory, which would be very useful for fast context-switching.
(It's worth remembering that at this level of fabrication capability RAM is faster than CPU, unlike now, so cacheing doesn't really come into it yet and the "zero page = lots of registers" idea works.)
Yes. I have always regarded anything that is, in principle, Turing-complete (ignoring resource and I/O format limitations) as a full programming language, but that includes a surprising number of things people don't think of as programming languages. It is useful because, inter alia, all such languages are vulnerable to all of the generic classes of attack. However, there are also things which could be described as partial programming languages.
sendmail configuration is turing complete. :)
I do wonder if the US secret arsenal includes a stealth rotodyne.
Impossible to know, but I will note that to a non-expert a rotodyne is probably indistinguishable from a slightly weird-looking helicopter. I mean, we have helicopters with tandem paired rotors, helicopters with contra-rotating stacked rotors, regular helicopters, NOTAR helicopters, helicopters with shrouded tail rotors, and plenty of choppers with winglets and underslung pods. Precisely how the rotor is spun up is not obvious to a non-specialist.
Re: "languages that are AI-proof."
I mean a language that a human could learn and understand but that an AI could not.
One possibility is: In English one word can have multiple meanings with the intended specific meaning understood only once the utterance is completed. (Not sure how many other languages share this feature.) However, because AI math includes looking at clusters (e.g., context for word usage) this suggests that English is not AI-proof.
Guess I'm actually asking what, if anything, is the underlying fundamental difference between human and AI language because if such a fundamental difference exists then there's also the likelihood of misunderstanding/miscommunication between the two (or secrecy).
Change of subject from the strange attractor of computers to the strange attractor of nukes (sorry Heteromeles).
The Commonwealth is willing to shoot down a stealth drone with a nuclear warhead missile. A nuclear warhead.
Quite apart from the relative costs of nuke and drone (which admittedly for a high tech stealth drone might not be as much as I think), aren't the Commonwealth worried about fallout? Or EMP effects on their own electrical infrastructure?
I'm imagining the USA in timeline 1 deliberately sending drones through on days when the wind is blowing towards the capital...
I keep typing a response, and then starting over, so I think I'm just going to have to quote history: "The question of whether Machines Can Think... is about as relevant as the question of whether Submarines Can Swim."
"Two more things: programming style is not taught, ..."
I am doing so at present.
I agree strongly with Elderly_Cynic on this - I've done it too, albeit on a far less grand scale (namely, to the software apprentices that passed through our team).
IMHO one of the problems is that when many programmers talk about "style", they mean coding style — the low-level "camelCase or underscore", or "put m_ before the class attributes" stuff. Perhaps they reach the dizzy heights of insisting on Hungarian notation, or banning recursion and multiple inheritance.
Only rarely do you see departmental or corporate style guides that talk about design style - encapsulation, or preferred error handling, or the expectations surrounding complexity.
So, you try and teach by good example. You show the developing engineer some hideous code, and some cleaner stuff. You explain why it's hideous, or clean. You try and explain why those parts of your style relate to quality, those to maintainability, those to robustness, those which to testability. You review what they produce in a constructive way, explaining why (generally on balance) you prefer one way over the other. You couple the concepts of "process" with those of "design" and "style", and try to show how it fits together.
You admit when you're wrong. You acknowledge that there are exceptions to rules - because rules are for the guidance of wise men, and the blind obedience of fools :)
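For what it's worth, the "hideous vs. clean" contrast can be shown in a few lines. A toy, invented example — both functions behave identically, and neither comes from any real codebase:

```python
# The 2 a.m. maintainer's view: one of these forces you to reverse-engineer
# intent; the other states it.

# Hideous: cryptic names, a buried conditional, cleverness for its own sake.
def f(x, y):
    return [i for i in x if i in y and x.count(i) < 3] if y else x

# Clean: same behaviour, but the intent is written down.
def filter_shared_items(items, allowed):
    """Keep items that also appear in `allowed`, dropping any item that
    occurs three or more times in `items`. An empty allow-list means
    no filtering at all."""
    if not allowed:
        return items
    return [item for item in items
            if item in allowed and items.count(item) < 3]

# Both agree on every input; only one of them explains itself.
assert f([1, 2, 2, 3], [2, 3]) == filter_shared_items([1, 2, 2, 3], [2, 3])
```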
...
Of course, you can just leave them to it, and let them sink or swim. In two or three years, they'll have found something that works for them, they'll be telling themselves that they're now a senior engineer, and who is anyone to tell them how to do their job? They've got years of experience! And in another decade, they'll still have two years of experience, just five times over...
The US had a couple of nuclear air-launched anti-aircraft missiles, the unguided AIR-2 Genie and the guided AIM-26 Falcon. The Genie was test-fired as a live missile, I don't think this happened for the Falcon. No significant EMP, no significant fallout. I expect the Soviet union had similar missiles too.
The US government fired off about 200 atmospheric nuclear tests in the western central US over a period of about 15 years. Fallout from those tests was detected in many situations — including, famously, Sr-90 in children's teeth — but actual noticeable effects? Bupkis. For various reasons the human race developed a lot of really sensitive tests for radioactive isotopes, and we find them when we look for them. Fallout makes for some really kickass game concepts, though.
Re: ' ... as relevant as the question of whether Submarines Can Swim."'
Okay, I'll bite ...
Yes, this is relevant if the for-earth-ocean-designed sub is sent to and expected to 'swim' in Jupiter's 'oceans'.
"but actual noticeable effects, bupkis" — err ... no. I know that Eastman Kodak made representations to the US guvmint (for whom they were a sizeable contractor) warning them that it would be a really good idea if atmospheric nuke-testing could be stopped — or at the very least scaled back a lot — because the radioactivity being spread around was beginning to noticeably affect photographic film. Which the US guvmint used for aerial reconnaissance. I think all the other major photo manufacturers were worried about it as well.
Birds fly, aeroplanes fly, but not like (most) birds. Submarines "swim" but not like fishes. AI will "think" but (?) not like a human (?)
Not sure if this is off topic, but it hits one of OGH's hot buttons, hard.
New white paper maps the very real risks that quantum attacks will pose for Bitcoin
"Quantum attacks on Bitcoin, and how to protect against them": "On the other hand, the elliptic curve signature scheme used by Bitcoin is much more at risk, and could be completely broken by a quantum computer as early as 2027, by the most optimistic estimates." (pdf)
Haven't read it yet (will; potentially work-related), but - today's trading (Nov 2): $7,080.27 +4.89%
I do wonder if the US secret arsenal includes a stealth rotodyne.
Easy answer to that - no chance, with a side-dish of "are you insane"?
This is a technology where the noise levels at 300m were similar to someone shouting very loudly at your ear from a range of two feet. Where you don't just have a large rotor disk, you've also got unshrouded propeller disks flashing away to generate lots of HERM. Most of the stealth disadvantages of a V-22 with all of the disadvantages of a CH-53.
As pointed out earlier, you've also got high-pressure fuel and ignition mechanisms, passing through a single unshielded point of failure at the most vulnerable corner of the flight envelope (coming in slowly to land). In terms of resistance to battle damage, it's a joke.
There have been some impressive capabilities - but rather than a Rotodyne, you would be more likely to see CREDIBLE SPORT, STARS, or even an EXINT pod
Re: rotodyne. When I found out what this was, I immediately remembered the failed Rotary Rocket SSTO project. (Not sure it wouldn't have worked; they just ran out of money.)
Re: 'AI will "think" but (?) not like a human (?)'
Agree - however, we're putting a lot more trust in and responsibility on AI vs. your other examples. With humans, we've a fairly good idea about how many and what types of things can go wrong. With AI ... ?
It's the 'but not like a human, so how' that I'm curious about.
The 8900 resembled (very closely) a Data General Nova.
Well, it was always a microcontroller. It just barely worked in early personal computers. As has been pointed out, it was very fast for what it did. Programmers learned how to get a great deal out of it, but why not have sane architecture that you don't have to fight with? (See also: 8088).
The strength of the 6502 is that it was done using 3300 transistors (similar to the 8008), but was actually useful. The weakness is that the compromises to fit in that footprint were, shall we say, not forward-looking.
All those changes you want to put on the 6502, just start with a decent architecture to begin with.
There were other mainframes than IBM and other terminals than 327x. e.g. VT52 on a DEC-10 (and lots of others)
Well, you only actually need machine language :)
In fact, the early CDC OS's were written in octal. They disassembled them later when they got an assembler working....
I'd rather not. My first job involved assembly code for a one-off signal processor stage, designed by a bunch of Swedes. It was a Very Long Instruction Word (VLIW) machine, with a 128-bit instruction; four-core SIMD with 12-bit data inputs, and 16-bit internal processing.
Different parts of the CPU were controlled by different "phrases" within that instruction - some related to the current cycle (IIRC the ALU operation, branch tests), some related to the next cycle (setting up the read and write memory address generators), and some related to the next-but-one cycle (parameter memory address generator setups).
A deeply fun experience - after we signed off the production version of code, I spent the next seven years hoping that they wouldn't find a bug and make me go back to it.
There's a bit more to the Rotary Rocket story than "They ran out of money."
A longstanding friend of mine was working there. I was on the phone with her very shortly after the shutdown announcement went out.
aren't the Commonwealth worried about fallout? Or EMP effects on their own electrical infrastructure?
Probably not. At high drone altitudes, say 10 to 20 km, fallout products will be finely dispersed, carried up and fall slowly, meaning that they will be dispersed over wide areas by winds at altitude and have many days to decay. And the wide area EMP producing phenomena that people usually worry about take place at much higher burst altitudes.
That said, it's not the kind of thing I'd recommend doing as a habit.
Yes, I wonder what the mortality rate of pilots would have been had they gone into production. It sounded like a scary beast to land.
The electric vs. petroleum comparison I care about, personally, is range and ease of re-energizing (filling the tank or charging the battery).
The silly-simple way to think of it is that if the energy density of batteries increases by about a factor of 10-20, then batteries take over for most applications (with the exception of main battle tanks and similar), because you can put a battery where you put the gas tank and the range will be the same as for gasoline. Yes, the power train matters too, but we already know how to run civilization on these kinds of energy densities. The rest are hiccups and retoolings. The problem is that battery energy density is 20 times lower, and that requires major accommodations and/or limitations (like using the car solely for short hops and commuting--while this is critically important, it's also specialized, and I'd hate to depend on an eCar to evacuate from a disaster).
Refueling does matter if time is money (or lives, as when refueling a tank during a battle, or a car ahead of a hurricane). If we had small cars with, say, a 120 kWh battery, we could deal with the range anxiety (120 kWh would send a Bolt over 450 miles), and charging overnight would be as good as tanking up, in most but not all cases. Such a car would be a nuisance to road-trip in, because you'd be limited by the car's need to recharge, not your own endurance. But it could be done.
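A quick sanity check on the "over 450 miles" claim (assuming a first-generation Bolt — roughly 238 miles EPA range on a 60 kWh pack — and naive linear scaling that ignores the weight of the bigger pack):

```python
# Scale the Bolt's efficiency up to a hypothetical 120 kWh pack.
bolt_kwh, bolt_miles = 60.0, 238.0      # approximate 2017 Bolt EPA figures
miles_per_kwh = bolt_miles / bolt_kwh   # ~4 miles per kWh
big_pack_miles = 120.0 * miles_per_kwh  # ~476 miles on the doubled pack

print(f"{big_pack_miles:.0f} miles")
```

Real-world range would be somewhat less, since the second 60 kWh of battery is dead weight the motor has to haul around; the order of magnitude holds either way.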
There are some fairly important follow-ons from this, because we've built the US and other countries around technology that can send a car 400-odd miles on a tank of gas and refuel in a short time. If we're stuck with big, slowly charging batteries and no gas, we'll need to rebuild civilization to accommodate them, and this will take a lot of work (and petroleum expenditures) to accomplish (start with how many people have to commute over 100 miles/day and go from there). The more we can make electricity act like gasoline, the less we have to rebuild, and possibly the fewer problems we'll have with this particular part of climate adaptation (assuming humans all follow the better angels of their nature, which of course we always do all the time...). Personally, I think the follow-ons from wide-scale shifts in how we power things are probably more important than how often we tank up, but that's just me. Most people have been conditioned to care more about convenience than consequences.
The ultimate shotgun if you don't have a Test Ban Treaty, I guess. But you're right, it says more for their bomb-making skills than it does for their aim.
Speaking of aim, I wonder how you fly an autonomous, world-walking drone. Those things have a high crash rate to start off with, even when they're being remotely piloted. That crash rate would go up massively if they had to do long-duration flights autonomously, then return to some point, jump, and then pass back under control in order to be landed. What fun.
Being an innately silly person, I wonder why they don't simply rig a U2 with the right equipment and loft that. Those things are close enough to flying transformers as it is. Surely one more module wouldn't mess up the mission that much...
Does what you say hold true if we still follow the 80/20 rule? In other words, electrify only 80 percent of the US's transportation and leave the remaining 20 percent petroleum-powered? How much of the infrastructure would we have to rebuild then?
"it says more for their bomb-making skills than it does for their aim."
Well, that's kind of the point - they're doing the same thing that we did with early nuclear missiles, or on a smaller scale with flak - if you can't aim accurately enough to hit the target, use a bigger bang so it doesn't matter when you miss. (And the altitude is wrong for EMP, and as for fallout, not only do airbursts not produce much but what is produced will be carried by the prevailing winds into enemy territories, so they probably aren't too concerned about it.)
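The "bigger bang instead of better aim" trade-off has a well-known shape: blast damage radius scales roughly as the cube root of yield (Hopkinson scaling), so covering a miss distance twice as large takes about eight times the yield. A sketch with illustrative numbers, not real weapon data:

```python
# Yield needed to push a lethal radius out to a larger required radius,
# given cube-root scaling of blast radius with yield.
def yield_needed(base_yield_kt, base_radius_m, required_radius_m):
    """If base_yield_kt is lethal out to base_radius_m, return the yield
    lethal out to required_radius_m (radius ~ yield ** (1/3))."""
    return base_yield_kt * (required_radius_m / base_radius_m) ** 3

# Doubling the radius you must cover costs 8x the yield.
print(yield_needed(1.0, 300.0, 600.0))
```

Which is why sloppy guidance plus a big warhead was a workable air-defence recipe, for us and apparently for the Commonwealth.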
IIRC the US don't actually know this yet; they know something ate their drone, but they don't know what it was, the drone just didn't come back. So they don't yet know that the Commonwealth isn't up to US standards of targeting ability; seeing CRT displays on computers in the railway offices isn't much of a hint, partly because LCD flat panels are a different manufacturing problem from LSI chips, partly because there's no guarantee that an alien line of development would be aiming for them anyway and they have seen a few clues that it might not be, and partly because if it is like this world, you might well expect computer terminals in railway offices to be old and clunky compared to what you get elsewhere. On the other hand it means that the US also don't know that that timeline is much less averse to casually chucking nukes about than their own.
In English one word can have multiple meanings with the intended specific meaning understood only once the utterance is completed.
Mostly. One hopes. Heck I understood the meaning of the word "and" in Charlie's admonition about DT posting differently than he meant it. And I went back and re-read his sentence and decided yes he was right but I wasn't wrong. Big pond issue or fuzziness in English. Pick either one and still there's an issue.
Let's hope this Chinese spaceplane isn't vaporware.
https://arstechnica.com/science/2017/11/chinas-secretive-spaceplane-may-launch-in-2020/
Note that in my opinion, it sounds too good to be true, but who knows?
Been there. Done that. Got the shirt and hat. No thanks. I'll stop and write an assembler. At least a crude one.
Sort of did, once. Way back in ancient times an async controller was not able to do what we needed, but to use it we had to load a blob of code into the board first. So I looked at the board, saw the 8008 chip, went out and bought an 8008 manual, and wrote a disassembler for the code. Quickly added some tags for obvious data and the ability to define labels. We then modified the code to do what we wanted.
Only 2K, but them WAS ancient times.
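The sort of table-driven disassembler described above can be sketched in a few lines. Note this is a toy: the opcode table below is a made-up illustrative subset, not the real 8008 instruction set, and the real job also involved spotting data regions and defining labels by hand.

```cpp
#include <cstdint>
#include <map>
#include <string>
#include <vector>

// One table entry per opcode: mnemonic plus how many operand bytes follow.
struct Op { std::string mnemonic; int extraBytes; };

// Walk the binary blob, looking each byte up in the opcode table.
// Unknown bytes are emitted as raw data ("DB n"), which is roughly how
// you discover which regions are data rather than code.
std::vector<std::string> disassemble(const std::vector<uint8_t>& blob) {
    static const std::map<uint8_t, Op> table = {
        {0x00, {"HLT", 0}},      // halt, no operands
        {0xC0, {"MOV A,B", 0}},  // register-to-register move
        {0x06, {"MVI A", 1}},    // load immediate: one operand byte follows
        {0x44, {"JMP", 2}},      // jump: two address bytes follow
    };
    std::vector<std::string> out;
    for (size_t i = 0; i < blob.size(); ) {
        auto it = table.find(blob[i]);
        if (it == table.end()) {
            out.push_back("DB " + std::to_string(blob[i]));
            ++i;
            continue;
        }
        std::string line = it->second.mnemonic;
        for (int k = 0; k < it->second.extraBytes && i + 1 < blob.size(); ++k)
            line += " " + std::to_string(blob[++i]);
        out.push_back(line);
        ++i;
    }
    return out;
}
```

Feeding it a six-byte blob like `{0x06, 0x2A, 0x44, 0x10, 0x20, 0x00}` yields three lines: an immediate load, a jump, and a halt.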
The mass infrastructure replacement problem arises because we do not have a programme aimed at getting off fossil fuel with the least hassle. (Instead we have isolated things going on which are not concerned with getting off fossil fuel, but which people not involved with them hope will have that result as a side-effect.) If we did have such a programme, the resources currently being used in scattered research into battery chemistry might be redirected into concerted research into the chemistry of artificial photosynthesis. Given that, there would be no need to replace any of the infrastructure apart from the actual oil wells; you just feed the current system with artificially-photosynthesised hydrocarbon feedstock instead of dug-up hydrocarbon feedstock, and keep everything else the same.
Back in the era we are discussing most of the computing was done on IBM mainframes and imitations. There was a lot of other stuff around in somewhat large absolute numbers but still dwarfed by the number of displays connected to IBM type systems. With all of the 327x headaches that created.
And yes minicomputers started to change that but talk about fragmentation. [eye roll] Then personal computers showed up and MS took over and suddenly the user interface options totally changed.
This whole discussion of computer languages and instruction set architectures is fun (though I think, often misguided). However, thinking about this from the point of view of the story, it seems to miss the real areas where borrowed tech matters.
The huge deciding factor in the development of Commonwealth computer technology isn't computer languages with formal proofs or whatnot: it's hardware and organized central planning.
To elaborate, the Commonwealth gets to borrow 75 years worth of science, chemistry, semiconductor process technology, physics, and so on ad nauseam. What's that, 40 years of work to develop bright LEDs in every color? Psst, here are a few chemistries to try; you should have that figured out in a few months. Integrated circuits? Let's have a little chat. We should be able to double density every few months; how about we call it Miriam's Law?
Having hardware development move at such a rapid pace means that software is really going to be along for the ride in a lot of ways. There simply isn't going to be time to develop a lot of well thought out, polished systems. Adam in #119 mentioned programmer-centuries that have gone into LAPACK etc., but the truth is that we've put that much effort into a dozen different operating systems, countless applications, etc. The commonwealth is trying to build the same thing, but far more rapidly -- they need software now and it takes time to write.
Doing something like formally proving software and hardware components, inventing really good polished high level languages and so on sounds great in theory. In practice, it seems like this scenario has things changing so fast, it's going to be chaos, with lots of shoddy, buggy stuff.
Trying to make everything good from the start, for example a secure TCP? Arguably bad second-system effect stuff. Say you need TCP in year 3, but don't need the security until year 10 (because everything has physical security until then). Not to mention the mathematicians... and on top of that TCP needs to run on tiny computers, but the security need appears when you're up to 32 bit, right? So the answer is to build what you need, but emphasize the ability to forward-port, recompile, etc. as you go, so the transition isn't painful.
The big thing the central planning gives you is the ability to avoid false steps. Segment registers? No, in three generations we're going to do memory mapping this way. You're having an argument about graphics? Not planar thanks. And so on.
Could you please provide documentation of the high drone crash rate, compared to piloted aircraft?
Reason I ask: there's some chatter going around about pilotless passenger transports, and I'd REALLY like to have some Class A mishap rate safety data available to shoot it down. If the remotely-piloted or autonomous drones have higher mishap rates than piloted airplanes, that has dire consequences for remotely-piloted or autonomous passenger and cargo vehicles.
Note: The insurance companies will also be looking at this, and they are known around the world for having No Sense Of Humor At All.
"It's a very fuzzy world when you start deciding what is and is not a programming language..."
I agree completely.
Notions such as "programming", "programming language", "computing", "algorithm", etc make nice intuitive sense. But poke them hard and the limits are not at all clear.
Nor is there any real reason to believe that these notions should have sharp boundaries instead of fuzzy ones. Lots of useful concepts are more like the concept of a "game": not a concept with sharp edges defining what's in and out but instead a bit fuzzy around the edges.
My (unfinished) PhD thesis was on philosophy of computing. I could write you a lot of pages reaching that same conclusion if you wanted, but I think you've hit the nail on the head.
"However, thinking about this from the point of view of the story, it seems to miss the real areas where borrowed tech matters."
Absolutely.
Jumping to modern corn, wheat and rice hybrids would likely make a vastly bigger jump in GDP than everything to do with computers. I've no idea about cotton, but expect the same may be true there.
The % of the US workforce that is in agriculture dropped from 35% in 1900 to under 3% in 2000, while moving to producing vastly more food per head of population, of more types, that people like more. That ten-times increase in agricultural yields was a huge deal. I know, better crops didn't do all that - but they did at least a third of it, and moving seeds paratime should have been about as safe and trivial as you can get.
Cows that give more milk. Sheep that grow better wool (and reliably have twins). Etc.
Humanity's created biological capital is a very important part of our wealth.
Here's a quote from wikipedia "Agriculture in the United States":
"In 1870, almost 50 percent of the US population was employed in agriculture.[16] As of 2008, less than 2 percent of the population is directly employed in agriculture.[17][18]"
A 25 times increase in labor productivity is just amazing.
It's the 'but not like a human, so how' that I'm curious about. Aren't we all? But no-one at all seems to be addressing the problem, apart, perhaps from Mr P Watts ...
Looks like Skylon/HOTOL backed by PROPER guvmint money & long-term thinking, as opposed to, you know .... Good luck to them. IF it works, you could see Skylon given a boost, because, IIRC, no-one else is doing this.
Exactly so - the real test of "good code" is not how long it takes to write, but how long it takes to understand 10 years from now, when the guy who wrote it has gone on to other things (or even when he's still here but hasn't looked at it himself since it passed SAT).
And a couple of designs with side by side rotors with independent discs (I can think of a Mil and a Focke-Achgelis off-hand), some Kamans with interlaced but not co-axial rotors...
Aircraft and submarines are devices for transporting pink meat, not organisms. UAVs are usually scaled down examples of aircraft (I'm well aware of US types such as the QF-4) and submarines.
Any configuration language that supports...
... is Turing complete.
s/other things/$& in the sky/
Suggest that the concept isn't necessarily wedded to tip rockets. Isn't the same overall concept still there if the main rotor is driven like a conventional helicopter for VTOL, but reverting to auto-rotation for level flight?
Yes, precisely. Defending oneself against a competent, well-funded attacker (especially one that has placed an agent internally) isn't a matter just of securing the main gateways and authority mechanisms, and draconian secrecy rules do nearly as much harm as good. If a configuration is static (fairly common) and only very 'important' and specially cleared people are allowed to view it, how long before an attack would be discovered? And, of course, imprisoning teenagers who hack in from a laptop in their bedroom (show trials 'R' us) is merely denial and vindictiveness.
I think you missed the point there ....
"I mean a language that a human could learn and understand but that an AI could not."
There are plenty, at present. Non-simple English or Japanese, for two - note that, at a certain level, inventing new words and idioms on the fly is part of English usage, at least this side of the pond and down under. Whether one exists at all depends on whether you believe the likes of Penrose and people like Pence, who assert there is more to human minds than emergent properties of describable biological actions. But, even if one doesn't, we don't yet know anything like enough about the human mind to write an AI that will match an intelligent human.
The tip rockets were relatively light, driving the lifting blades via a conventional engine in helicopter fashion means you've got either a large engine cutting into payload during cruise or some horrendously complicated power transfer system between the horizontal engines and the lifting blades. If you were designing a Rotodyne from scratch today you might go for an electric drive for the lifting blades driven by generators on the horizontal engines I suppose.
Ok, if I modify your comment to:
I still don't get what you're talking about -- Algol 68 gave the programmer no control over the garbage collector at all, and I still don't see what you mean by "secondary stack"; Algol 68 had a stack (for LOC generators) and a heap (for HEAP generators).

The trouble with that is that as soon as you go away from the rotor tip drive to a mechanically driven rotor, torque reaction rears its ugly head and you need to introduce complications like a tail rotor (or equivalent thereof) or contra-rotating rotors.
The thing about the rotodyne (and other rotor tip drive concepts) was the sheer simplicity, because so much messy, heavy, complicated, mechanically lossy, aerodynamically untidy stuff just goes away...
Sendmail as a Turing machine.
Quite apart from the relative costs of nuke and drone (which admittedly for a high tech stealth drone might not be as much as I think), aren't the Commonwealth worried about fallout? Or EMP effects on their own electrical infrastructure?
No more than the USA was, circa 1958-1988: see the Nike Hercules for a direct comparison.
Note that the Commonwealth air defenses are fine-tuned to deal with a stream of A-bomb toting French strategic bombers that could swarm across the Atlantic, the Arctic, and the Bering Straits at any time. A nuclear missile is used because that's what they've got (and if you wonder why in our time line gun turrets on bombers, and massed bomber formations with aircraft covering each other against interceptors, went out of fashion in the late 1940s/early 1950s, nukes-on-SAMs is precisely the answer you're looking for: it tilted the balance of power drastically towards the defenders).
Note also that, unlike the USA circa 1950, the Commonwealth began rolling out its modern infrastructure in the full knowledge of what EMP does to integrated circuitry. Expect their standards for personal computers to include Tempest shielding and hardening against moderate EMP, at a minimum.
There is a certain type of person who thinks it is cool to make every little thing Turing complete. I used to be one of them, but long ago came to my senses and concluded that they are insane.
The undecidable compiler is simultaneously the coolest and nastiest thing about C++. Who couldn't both be impressed and slightly nauseated by a compile time ray tracer?
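For readers who haven't met this, here's a minimal taste of what "the compiler computes" means (nothing so ambitious as a ray tracer): the recursion below is evaluated entirely during compilation, and the `static_assert` is checked before any program ever runs. This is standard C++11 template/constexpr machinery, and taken to extremes it's the same mechanism behind compile-time ray tracers.

```cpp
#include <cstdint>

// The compiler expands this template recursion while compiling;
// the result is baked into the binary as a constant, and no
// runtime loop ever executes.
template <uint64_t N>
struct Factorial {
    static constexpr uint64_t value = N * Factorial<N - 1>::value;
};

// Base case: explicit specialization terminates the recursion.
template <>
struct Factorial<0> {
    static constexpr uint64_t value = 1;
};

// Checked at compile time: a failing assertion is a compile error,
// which is exactly why the compiler can be made to "run" programs.
static_assert(Factorial<10>::value == 3628800, "evaluated during compilation");
```

Because template instantiation can recurse and branch, it is Turing complete (up to implementation limits), which is why the compiler's behaviour on arbitrary programs is undecidable.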
"Exactly so - the real test of "good code" is not how long it takes to write, but how long it takes to understand 10 years from now, when the guy who wrote it has gone on to other things "
The C++ codebase I'm working on is from 1993-1994, according to the comments in the headers. Though it has been patched many times since, mostly by people who don't understand it.
Clean initial design, maintained since by illiterate gerbils who have defecated in the corners and widdled on everything.
No documentation that I trust to be accurate. No vendor support, they shot the gerbils and gave up on the product.
One day your code will be like this.
As pointed out earlier, you've also got high-pressure fuel and ignition mechanisms, passing through a single unshielded point of failure at the most vulnerable corner of the flight envelope (coming in slowly to land). In terms of resistance to battle damage, it's a joke.
As I understand it, that's not true. Tip jets on gyrodynes are fed by compressed air through the drive shaft; you're not pumping fuel down the blades (unless you're confusing this with Roton?). Or maybe the Hiller Hornet (tip-jet ramjets for propulsion? Yow!).
You've got a somewhat simpler gearbox and rotor hub up top—it's essentially an autogyro, with tip jets to spin up the rotor so it can take off vertically (unlike a classic autogyro). Fairey claimed their design studies suggested they could have cut at least 10dB off the noise output by making some simple changes to the second prototype, never mind the results of a decade or two of active development and new airframes.
(Back to the novels: my thought is, if you want a V-22-like capability and don't have the ability to throw three decades of tiltrotor research and several tens of billions of bucks at the problem, a rotodyne is a much cheaper/lower tech option. Many of the same drawbacks in terms of noise/range/stealth, but it's able to autorotate (hence much simpler gearbox arrangements) and you can build it with 50s tech, not 90s tech.)
OK. CREDIBLE SPORT I knew about — you know the videos of the C-130 trying it for the first time (and failing, spectacularly) are on YouTube? — STARS is something I was looking for the name of (and background details) because I'm using something very like it in a book, but the EXINT pod is just bugfuck crazy. Ahem. Maybe not crazy to the sort of folks who try stowing away in the landing gear of a commercial airliner, but I'd still carry a change of underpants and a lucky charm or sixteen.
Let's hope this Chinese spaceplane isn't vaporware.
I've said it before and I'll say it again: a spacecraft needs wings, a wheeled undercarriage, and a runway like a fish needs a bicycle.
The commonwealth is trying to build the same thing, but far more rapidly -- they need software now and it takes time to write.
Yes. And they also have a huge handicap: lack of human capital.
We've had general-purpose digital computers and programming languages for nearly 70 years, but it's only in the past 35 years or so that programming has become a mass activity — I'd date it to the early 1980s, when computer labs became ubiquitous in high schools and home computers became affordable.
The importance (to Miriam) of rolling out washing machines and launderettes in factories everywhere is that it frees up an immense amount of human capital for activities more productive than bashing fabric on rocks. Similarly, the importance to the Commonwealth of rolling out home computers is that it generates a culture in which a certain proportion of young adults teach themselves the basic idea that a computer is programmable, and can then be filtered off and trained to do the job properly.
Lest we forget: around 1980, a "wildly successful" personal computer maybe sold a million units; many of them sold in the tens of thousands instead. IBM originally expected the 5150 to sell 50-100,000 (it, and its clones, sold tens of millions). Go back to the mid-1960s and the first truly successful minicomputer, the PDP-8, sold about 50,000 units across its entire life. Today, a "wildly successful" personal computing device is something like the latest iPhone or Samsung flagship, and sells tens of millions in its first week out.
I submit that if you're trying to jumpstart a technological revolution you can't bootstrap the mass skill base you need fast enough with PDP-8s (cost: $140,000 in today's money: one per school, there's a form to sign up for your one hour per week session at the console) — you need Apple IIs ($5000 in today's money: one lab per school, many more schools have them) or BBC Model B's (about $1000 today: entire classrooms, plus lots of them in students' bedrooms), produced in multi-million volumes.
Yes, precisely. Back in the early 1970s, a lot of us predicted a workstation revolution when the price hit 5,000 (pounds or dollars), which could be used to run mini-computer applications, starting in the research arena. The point is that such people (and small businesses) could squeeze that out of existing budgets, but 10,000 needed a special budget. We were dead right, though we were completely wrong about how it would happen! I was expecting one of the Sun/Apollo copiers (who did hit that price point c. 1979) to deliver something usable, but they never did. As it was, the first workstations widely used for practical work were the Apple II, BBC Micro and IBM PC/XT, all of which were marginal in various respects, and the revolution happened when the Intel 386 systems arrived (including the IBM PC/AT). Incidentally, did you know that the original IBM PC wasn't even intended to be used as a free-standing system by IBM Galactic Headquarters, and IBM didn't abandon their belief that the future was PCs serving a mainframe until after the demise of the PS/2?
As you say, where speed is the objective, you need something like a BBC Micro out as fast as possible, followed a few years later by one of the Intel 386 Unixish systems. Setting up services on a modification of one of the better mainframe time-sharing systems (POP II, MTS, Titan, GUTS etc.), would be an alternative to the BBC Micro, but that would probably fall because of the lack of people to run the service.
I'm quite certain that you missed at least one, possibly two, points:-
1) Transport machines do something fundamentally different to animals. 2) There's a silent "Human" in "Artificial Intelligence".
When it comes to flinging around the instant sunshine - we have to understand that we're a generation that has cultural baggage surrounding the tolerance of casualties. AFAIK, none of us has experienced an existential war - but many of us know someone who has.
The generation that lived through the Second Big Mistake had seen civilian populations treated as legitimate targets, and killed by their tens and hundreds of thousands in air raids. Those who watched as the Luftwaffe bombed London, Glasgow, Liverpool, and Coventry; or even those who watched as the IJA tore through Nanking; knew full well that "they" were out to kill "our" families, and became quite happy to accept that "we" were going to make the rubble bounce in "their" cities, or even turn them into glass car parks. If the cost of "not shooting down a bomber" is a fission weapon used on a city, then a few comparatively-clean small-yield airbursts are a perfectly acceptable risk.
The military of that era were willing to balance far higher risks to life in order to gain performance - because every week that your front-line crews are flying against Fw.190 in Hawker Hurricanes instead of Hawker Typhoons, guarantees more dead pilots.
If you look at the crash and fatality rates for the Gloster Meteor in peacetime service, they were incredibly high; a fighter pilot a week, often more. Granted, they weren't completely daft - the Hawker Typhoon was grounded as "too risky to fly in peacetime" almost immediately after VE-Day.
As often said, the past is a foreign country; they do things differently there.
Would you give me something like 100_000 lines (split roughly equally between stuff developed by a 3rd party using "formal methods" but not supplying their documentation, and stuff my boss and I wrote in-house using the same naming conventions), ported from Ada83 to Ada95 and tested using supplied test data in 4 man-months, as "at least trying to write good code"?
I think it's true of most blogs and user groups - that is, women are less likely to comment than men are. I don't know the distribution of SF readers by sex, and perhaps men are still more likely than women to read SF.
Like 47 of the other 49 states, Wisconsin awards the electoral college votes on a winner-take-all basis. Had Clinton won the majority of the votes in Wisconsin, she'd have gotten all its electoral votes.
In 2012, there was a movement in states like Wisconsin to award electoral votes by congressional district, but that failed.
I suspect you're right; I tried to do something by sampling the present membership of Follycon (BSFA Eastercon 2018) and gave up when I realised that they'd sorted by forename + if exists midname + surname and I couldn't just count a sample to get a meaningful answer.
No, "bugfuck crazy" is strapping yourself to the outside of an AH-64 to get to a fight - search for "Jugroom Fort Apache rescue".
Somewhat less daft is that I've sat in the doorway with my feet on the skids while a Westland Scout was flying us across Dartmoor; it was a wide back door, and we had two of us on each side. Safety was provided by a strap across the doorway, that you hung your arms over...
These days, the RAF MERT and USAF Pedro teams have the luxury of using bigger helicopters (Chinook and Blackhawk respectively) - that wasn't always the case. The EXINT pod is little different to how armies did CASEVAC from the 1950s to 1990s. Look at the Bell H-13 in the opening sequence of "M*A*S*H" - they stuck panniers on the outside to carry a pair of stretchers...
Past 300, so...
BOOK COLLECTORS' OPPORTUNITY ALERT: If your trading area features any Dollar General, Family Dollar, Dollar Tree, Big Lots, or their equivalent U.K. shops, walk the aisles until you find a book section, usually 5 or 10 feet of counter space loaded with Bibles, coloring books, teen romance etc. Spend fifteen minutes restacking and rotating the stock from one shelf to the next to spot titles of interest; there should be a few gems in the clutter, likely reduced to $1.00 on an originally thirty-buck hardcover newer than 2012. I found Tom Wolfe's "Back to Blood", Thomas Pynchon's "Inherent Vice", "Clean Energy Nation" by Congressman J. McNerney (formerly a wind turbine developer), and promising translations from Dutch, Swedish and Chilean authors. Also biographies and autobiographies of politicians like H. Clinton, Sarkozy and Leon Panetta. Not bad for a trash wallow. And I enjoyed "Inherent Vice" so much I found a copy of the 2014 movie on DVD at the library; it turns out it was produced by none other than the same Steve Mnuchin currently serving as Secretary of the Treasury. Besides getting three billion for the bank he sold, he also managed to find time to fund a slew of movies like "Transformers". Talk about a renaissance man, right down to his Machiavellian foreclosure practices. About time I got something for my tax dollar; the Pynchon movie was virtually word for word from the book.
Slightly. It used to be considerably, but things have changed.
http://www.sfwa.org/2014/01/reads-science-fiction/
I don't know of any competent research on why males are more likely to speak up in public than women, but I do know the politically correct arguments are essentially just polemic. As with the reading demographic, that used not to be the case, but there is something fairly subtle going on; I suspect that I know some factors involved, but could well be wrong, so won't introduce a red herring.
I don't know the distribution of SF readers by sex, and perhaps men are still more likely than women to read SF.
The word I've heard is that women account for the majority of fiction consumed, by about a 60/40 split.
Some genre consumer bases are gendered more heavily than others. Genre romance is about 75% female consumers (but note that 25% is bought by men, and it accounts for roughly 50% of all fiction sales). "Hard SF", with its Campbellian overtones of two-fisted engineering stories, and its spin-off, MilSF, is predominantly read by men — but it's still about a 65/35 split. Overall, SF is supposed to be about 55/45 male/female readers, and fantasy is an even split (skewing towards 75/25 once you get into paranormal romance, which started out as a subgenre within romance).
But anyway: the overt gender split among the commentariat on this blog isn't representative of the genre readership gender ratio, or even the ratio evident in people who contact me privately by email.
Since it is past 300, yes, he has. None of those easy solutions help much, as I can witness. The only thing that does is wintering in the sun, even if only for a couple of fortnights.
No, "bugfuck crazy" is strapping yourself to the outside of an AH-64 to get to a fight - search for "Jugroom Fort Apache rescue".
When I first heard of this, my reaction was, as near as I remember "These guys are totally bat guano crazy, and exactly the sort of people you want to come looking for you if you go missing in a war zone!"
no ( or maybe not ) and YES - that "silent H" is important.
Problem with AI at present - it's static - trapped inside a rooted-to-the-spot casing. Humans & other animals are mobile/motile. What's the (last?) test for intelligence? Ask Prometheus, who gave humans the gift of fire from the gods.
Heh, I always liked the chopper boys from the era of widespread helicopter hunting in NZ from the 50s to late 70s. Rex Forester put out an amazing couple of books on the subject in the 80s, reissued together with ample photos in 2002.
From the early days of seeing how many carcasses you can jam into an aircraft and still actually fly, to the late days of jumping out of a helicopter to wrestle a stag to the ground and tie it up for live recovery, more than half the guys were frankly mad. I especially liked the guy who jumped out of a chopper in the mountains to grab a tahr, only to have it kick and start them both tumbling towards a cliff. The pilot judged the angles, flew near the edge, and slipped to one side to move the blades out of the way so his shooter fell over the edge and into the back, then drastically tilted the other way to stop him falling straight out the open door on the other side. Nuts.
I agree with someone else - half a dozen to a dozen, not more. Yes, I am including assembly, but note that you'll probably have more than one assembly language, since it is architecture-dependent.
C for the low-to-mid level (unless someone wants to argue for PL/1, which I worked in at my first job, and which I understand was created to build 360/MVS).
Higher level languages for different kinds of data processing. Hell, a modified version of COBOL (add pointers, and some additional loop controls, so someone like me didn't have to write PERFORM 1000-DUMMY-PARAGRAPH THROUGH 1000-DUMMY-PARAGRAPH-EXIT WHILE ...). You would want something like this to, for example, process federal income tax returns for the country. And these one or two languages would have no trouble accessing d/bs.
A different one for GUI-writing and interactive code. FUCK JAVA!!!
And a utility program or two.
SQL isn't exactly a language. As I have read, it was created for managers to get quick reports, so that they didn't have to wait weeks for DP to cut and test the code. Besides, a relational d/b is NOT the Only answer. In fact, there's a good number of times that Ye Olde hierarchical is better and faster.
Anyone who suggests creating a "language" that secretaries can punch out columns to create a report (RPG) should be shot.
At a low level, I do include formatting. Most students don't, for example, indent, because they just want to get the code written for the class, and will never look at it again.
At the higher level, that's what I meant: structured, encapsulated code. Btw, back when I was working in C a lot, early nineties, a guy I had for an office mate for a while and I agreed: if a function was more than a 25 line screen or two, unless you were moving a ton of fields, it was too long, and you were trying to cram too much into it.
Companies almost never mandate things like error-handling style, or how/whether code should be modular, etc.
ARGH!!!
60-bit words on that damn Cyber....
Um, sorry, the computer revolution was in full swing within a year or two of the intro of the IBM PC. It was new at work in '84... or was that the end of '83?
Oh, and you've got an issue there: the PC-AT was 80286, not a 386. My first Intel box (my first computer having been a RadShack CoCo) was a 286, that I got for a good price in '87.
AAAAAAAAAAAAAAARRRRRRRRRRRRRGHGHGGHGHGHHGHGHGHHGHGHHGHGHGHHG!!!!!!!!!!!!!!!!!
And I've read that the author, Allman, said that the configuration was intended to be easy for the program to interpret, not for the convenience of puny humans....
Read a book a friend loaned me? Gave me? a bunch of years ago, autobiographical, written by a slick pilot about 'Nam.
You want GENUINELY certifiable?
Re: 'If there is such a difference then it's not AI.'
Are you implying that any human-begat AI is merely an amplified cartoon/truncated human?
Yes, but the 286 was pretty much useless in terms of memory addressing IIRC — things only really took off with the 386 (true 32-bit address space as well as backward-compatibility modes for running 8086 code). Interestingly, the 386 was implemented with only about 300,000 transistors as I recall—compare to the modern billion-plus component chips? And at one time I worked along with about 15 other folks on a single 386 running UNIX. With 32MB of RAM and a smart serial terminal i/o box driving a bunch of Wyse terminals it was quite capable of pretending to be a mid-1980s minicomputer.
Naah, THIS guy was certifiable.
Link goes to his obituary because the book is VERY out-of-print; but the obit misses out a lot of the juicy bits. And no, he didn't make just five ejections: he was Martin-Baker's ejector seat test dummy for about a decade (and an Olympic parachutist).
That isn't what was intended to be "the sendmail configuration language", that was intended to be the "somewhat human-readable representation of the rewrite language that sendmail runs", with another layer on top, generating that, that humans interact with. This, as it turns out, eventually materialised as "sendmail.mc" quite a few years later. Sadly, I actually find (correction, found; I am sufficiently out of experience with either these days) sendmail.cf more readable than the slightly more machine-hostile version.
Another language that had a similar thing was Lisp: what's now considered Lisp was the "symbolic expressions" (S-expressions), intended for the interpreter/compiler to deal with, converted from a more human-friendly syntax (M-expressions) that never really took off.
The first computer to sell one million units was Commodore's VIC-20, which had 5K of RAM (of which about 3.5K was available to the built-in ROM BASIC). They sold for around U.S. $300.00, and I'm guessing that Miriam's friend could find one on Ebay and ship it across timelines to the Commonwealth. The VIC-20 had a tape drive, and Commodore eventually developed a modem that sold for U.S. $99. There was also an add-on memory device which had 40K. This was in 1981.
The VIC-20 was replaced by the venerable and much-beloved Commodore-64, at around U.S. $600.00. Since Miriam came from the U.S., this is probably what she's thinking about in terms of how to evolve a computing culture.
If Miriam really knows the history of computing, however, she'll go the PARC-Xerox route and the ROM will contain something more like Smalltalk and less like BASIC. (I think the first PARC computers had around 8K.)
I worked on some software for those, once, and their statistics on spinal damage during ejection were horrifying - until you realise what the alternative was. But, yes, HE was certifiable.
Re: '... the commonwealth gets to borrow 75 years worth of science, chemistry, semiconductor process technology, physics, and so on ad nauseam.'
Not only borrow but sift through to identify the best kernels. Most SF alt-history stories I've read seem to require that the alt-world go through every single step and misstep in development. Why would anyone sane waste their time and resources recapitulating the coal era (incl. smog, lung disease, CO2 GW, etc.) when they can go direct to clean renewables. Also, once you've identified your likeliest energy source, you can work backwards to design the most efficient, lowest cost to build and operate infrastructure. I understand why most of our infrastructure was built directly on or next to older infrastructure, but that's what makes changing it so much more expensive therefore politically difficult.
Tech leap-frogging is not that far-fetched, just look at China, Saudi Arabia and a number of African countries who've opened trade with China and that are leapfrogging and cherry-picking their tech at a much lower cost than the West/US. Ditto leap-frogging of laptops vs. smartphones across first- vs. third-world geographies.
I worked in '286 *nix for a long time. It had a couple of slight quirks in its memory management that caused problems, including one particular run of processors (from AMD) not handling a segment-not-present case, meaning it couldn't handle segment swapping.
But the biggest problem it really had was that C and *nix were taking over when it came out, and C (despite uSoft's attempts) really wasn't set up to handle non-uniform pointer sizes, or complex pointer types. (You can think of the '286 as an attempt to fit Pascal and Ada -- stack, data, and heap in their own segments, and other objects could be in their own segments. Unfortunately, setting a segment register took a painful number of clock cycles.)
The '386 could have been very interesting if they'd used segments on top of pages, instead of mapping everything on top of a flat address space. (This would have allowed you to have multiple segments, each 4GBytes, which could have had their own permissions and page mapping tables.) Of course, they would still have needed to make the segment loading faster.
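The segmentation model under discussion can be sketched as a toy Python translation function (illustrative only: the selector values, bases and limits are made up, and real descriptors pack many more fields). Each segment carries its own base, limit and permissions, and an out-of-range offset faults instead of silently hitting someone else's memory.

```python
# Toy model of protected-mode segmentation, not cycle- or bit-accurate:
# a selector indexes a descriptor table entry holding base, limit and
# permissions; the linear address is base + offset, bounds-checked.
GDT = {
    0x08: {"base": 0x0000_0000, "limit": 0xFFFF_FFFF, "writable": False},  # flat code
    0x10: {"base": 0x0010_0000, "limit": 0x000F_FFFF, "writable": True},   # 1MB data
}

def translate(selector: int, offset: int) -> int:
    """Return the linear address for selector:offset, or fault."""
    desc = GDT[selector]
    if offset > desc["limit"]:
        raise MemoryError("general protection fault: offset beyond segment limit")
    return desc["base"] + offset

print(hex(translate(0x10, 0x1234)))  # prints 0x101234
```

The common "flat" model just points every segment at base 0 with a 4GB limit, throwing away exactly the per-segment protection the comment is talking about.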
The CDC 6600, and its brethren (6400, 6500, 6700, 7600) and descendants (Cyber 70, Cyber 170) central processors had 60-bit words. The peripheral processors used 12-bit words. (The PPs were descended from and VERY similar to the CDC 160A.)
In the eighties when the second wave of personal computers were developed (the 16/32 bit successors to the 8-bit first wave) nobody picked x86 processors apart from IBM. Apple Mac - 68000, Commodore Amiga - 68000, Atari ST - 68000, Acorn Archimedes - developed ARM. Of these (x86, 68k, ARM) the most widely used now is ARM and probably the best to 'borrow' to bootstrap a CPU industry.
Both sides routinely do that.
Look at Eddie Bernice Johnson's (D-TX) district in Dallas. If you can find the older one, it is especially telling. It had a fractal border, having been VERY carefully crafted to create a safe district for a female Democrat of color who was not remotely competent or ethical.
I had the misfortune to live in that district for several years.
There was also the redraw of the Dallas City Council districts, when Judge "Barefoot" Sanders ordered them to abandon at-large election and do a 15-1 single member district plan. As one of the local critics pointed out, it was easy to draw districts that balanced the demographics nicely against the total city, but it was IMPOSSIBLE to do that AND protect the incumbent City Councilcritters.
You have misunderstood what I was saying. Yes, you are right about the PC/AT - the c. 1984 developments were the game-changer, but the 386 was also important because it enabled decent operating systems and the full range of mini-computer applications. I don't know where you are posting from but, in the early 1980s, the use of personal computers was much higher in the UK than anywhere else and, if I recall, was several times as much as in the USA.
The revolution actually started much earlier. By 1977, $5,000 desktops were widespread, but all were set up to run a single application, and their users did not program them. Most of the practical uses of the early IBM PCs were the same (VisiCalc and others). Hobbyist computers took off about 1977, too, but were used almost entirely by 'geeks' and did not impinge much on the general public. That was important, because many of those became serious programmers, later.
An equally serious point was that most of the early personal computers encouraged people to program in BASIC. Most computer scientists and IT professionals dreaded teaching or employing BASIC programmers, as the first thing one had to do was to get them to accept that most of what they had learnt was positively harmful. BBC Basic was slightly better and, more importantly, the BBC Micro was designed to make data capture and machine control easy - so a lot of hobbyists learnt about that.
I was pretty closely involved with IBM at the time. Most of the early IBM PCs tried out by professionals were rejected as a pile of shit or unsuitable for the desired uses, except for the ones used to run a canned application. The IBM PC/AT, MS-DOS 3, Digital Research and others changed that, including by providing compilers for the languages used on mini-computers, but not until about 1984. Inter alia, a wayward program could and did trash the filing system under MS-DOS 2, because a critical table was kept in memory.
IBM also made a 68k based computer (https://en.wikipedia.org/wiki/IBM_System_9000) at the same time as their original PC.
Re: Gender split - SF/F readership
Just read the article you linked to. Not knowing how the survey was worded, it's difficult to tell what the results actually mean. As you suggest, one issue is how consumers and retailers (not just publishers) label genres. Depending on the bookstore, horror is sometimes in the same section as SF while older/classic horror, SF and spec-fic (Shelley, Wells, Huxley) are in the Literature section. Also, larger book stores may have a larger number of discrete sections and split out Fantasy vs. SciFi vs. YA (whose themes skew toward dystopias, zombies, vampires, etc.).
I imagine that the mil-SF readership gender profile/incidence changed considerably during the years that LMBujold was pumping out her Vor universe series at the rate of about one book per year.
"I don't know of any competent research on why males are more likely to speak up in public than women, but I do know the politically correct arguments are essentially just polemic"
That seems a very odd claim to me. When I taught at university I was, of course, concerned that all the students feel equally free to participate in discussions. As someone who tries to hire & retain the best staff, I have of course worried about this.
And the research is solidly empirical, very clear, and unequivocal. Women get interrupted far more, and have their ideas ignored more and taken up by the group less. Topic changes by women are more likely to be ignored, and their suggestions are less likely to be followed. Their ideas get challenged more, but more than that they get ignored significantly more, and women are more likely to be treated negatively if they are criticising others' views. Males who speak a lot are rated more highly by their peers, women less highly.
These effects have been measured in classrooms at the primary, secondary and university level, in business meetings, in private conversations. The effects are cross-culturally robust across various English speaking countries and regions. The effects are robust across racial and demographic boundaries.
So we've an extraordinarily well-grounded empirical fact that men get respected more for speaking up in group settings than women.
The jump from women getting more negative feedback for speaking up to "So this is why they speak up less" is not as well studied. I assume because most people think it's obvious.
Another reason ARM is a good choice is that they provide designs and architecture licenses rather than chips. Apple (A-series) and Qualcomm (Snapdragon) have both designed ARM variants and fabs like Samsung and TSMC have both been able to build them so there probably aren't any secret IP issues or backdoors to worry about.
You only have to watch any kind of panel discussion on TV / Youtube to see this happening.
Re: Gender bias in the classroom
Lots and lots of research!
Have also seen this myself at a parents' night at prog's school. This school is private, pushes academics, and is on-paper very forward thinking and 'enlightened'. One new teacher showed the same gender bias toward the parents as to the kids in his classroom during the Q&A part of the meeting. When called on it, he fiercely denied it but too many other parents also had noticed and complained. His contract was not renewed. The regular, public (US version) system doesn't have the ability to screen and fire such teachers so this type of social conditioning is likely to go on and be absorbed as this is the way that the world works.
http://www.edchange.org/multicultural/papers/genderbias.html
What did the Jesuits say about 'give me a child for 7 years'?
I think that you have misunderstood me. I am aware of those observations, and have looked at some of the research. But I was referring to the reasons underlying those phenomena, which are massively unclear. Yes, there ARE environments where it can reasonably be said to be 'the patriarchy', but it also occurs under circumstances where that does not apply. In particular, the primary school data makes one wonder whether it is part of the gender differences in the distribution of character traits.
Yes, it happens. And I have seen the converse, too. But the point I was making is also implied by that paper:
"This socialization of femininity begins much earlier than the middle grades. At very early ages, girls begin defining their femininities in relation to boys. ... Reay's research shows that each of the groups of girls defined their own femininities in relation to boys."
But WHY is that? And if it is a taught phenomenon, WHO does the teaching? Because there is evidence that starts before school, even in communities and households where the women are as independent as the men. I have some personal experience of this, and have some speculations on what might be going on.
Check This book out. Note that I'm not an expert, I just listen to the news and have google.
Dude, I remember this first-hand. I was there.
People, please don't try to oldfartsplain the 80s and 90s in computing to someone over 50?
"Tip jets on gyrodynes are fed by compressed air through the drive shaft; you're not pumping fuel down the blades"
The Fairey Rotodyne had a pair of auxiliary compressors driven mechanically from the main turboprops which fed compressed air down the rotor blades to combustion chambers in the tips. Fuel and ignition were delivered up through the centre of the rotor head and then down the blades. That was easy, because it only needs little pipes; it was the high-volume air ducting that complicated things. The tip jets were actual jet engines, not just compressed-air nozzles; the difference was that the rotating bits which are normally inside the engine were in this case a long way away, so the actual engine bit could be much smaller.
Details are here: http://www.flightglobal.com/pdfarchive/1957,pageid_56.html including what alternative rotor drive systems they considered and why they chose what they did.
The tip jet idea itself is considerably older and was apparently originated by Wittgenstein as a means of driving ordinary propellers, before he got into philosophy.
Re: Gender discrimination - classroom
How about social inertia via generational seepage?
Human populations are not all the same age at the same time: several generations (with their generational biases) all act on the newest addition to the human race. My thinking is that this is where a longitudinal study vs. a one-shot cross-sectional, one moment in time survey would help clarify what's going on and identify/weight the key sources/factors. (Plural 'factors', not just one factor.)
Oh, and the 'social inertia' thing is also gender-biased because women's issues are still categorized/dismissed as trivial or secondary, e.g. birth control, abortion, wage parity, rape, even checking for heart attacks in the ER! All of which amplifies the inertia. How to change this: don't reward sexism, and if possible, punish-by-wallet. (Hollywood is currently learning this lesson, 60 years post-Friedan.)
Of course that opens you up to the important questions.
Amiga or ST?
:)
"but the 286 was pretty much useless in terms of memory addressing"
Which reminds me of the time I was in a briefing on the Munitions List and learned that PC-ATs (80286) were on it. When asked why, the briefer explained that it was because they could be used to design nuclear weapons.
https://en.wikipedia.org/wiki/United_States_Munitions_List
As an aside, I was able to use my 68k* based personal computer to do useful physics calculations well into the 90s.
Wouldn't be seen dead with less than 16GB RAM and 512 CUDA cores these days, but the difference in capability between no computer and any computer is far greater than the difference between an 8MHz 68k and something modern.
*Not saying which one until I know which side OGH was on :)
How about a timeline in which Commodore allowed Borland to develop the "Turbo" software development environments/compilers for the Amiga? That would have been an interesting future!
With 32Mb of RAM and a smart serial terminal i/o box driving a bunch of Wyse terminals it was quite capable of pretending to be a mid-1980s minicomputer.
Well. It should. Because it was.
Yes, social inertia is possible, but a reason to doubt that it is the whole story is the consistency of the phenomenon. And remember that early conditioning is female-dominated. My view remains that, if you want to solve a problem, you first have to understand it.
"Oh, and the 'social inertia' thing is also gender-biased because women's issues are still categorized/dismissed as trivial or secondary, e.g. birth control, abortion, wage parity, rape, even checking for heart attacks in the ER! therefore amplifying the inertia."
You ARE aware that exactly the same can be said about men's issues, such as prostate cancer (compare with breast cancer) and suicide? Gender differences are not a simple issue.
The one I liked most was the ban on exporting mathematical cryptology to the USSR. Kolmogorov was still alive then, too :-)
Add to the list of languages required something like VHDL. Helps a lot if you can simulate your CPU designs before converting them to hardware, though of course you need an existing platform to run the simulation on. Several of the engineers who'd stayed at Acorn after the ARM spin-out were proficient, Acorn designed some co-processors and support chips for the STB and NC projects using it, and a number of my colleagues who had joined later dabbled.
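In the absence of an HDL toolchain, the same discipline can be sketched in plain Python (purely illustrative, and nothing to do with Acorn's actual designs): describe the circuit at gate level, then exhaustively verify it in software before committing to hardware, which is essentially what a VHDL testbench does.

```python
# Gate-level description of an adder, simulated in software.
def half_adder(a: int, b: int):
    """Half adder: sum = a XOR b, carry = a AND b."""
    return a ^ b, a & b

def full_adder(a: int, b: int, cin: int):
    """Full adder built from two half adders and an OR gate."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, cin)
    return s2, c1 | c2

# Exhaustive check against the arithmetic the circuit should implement --
# the kind of verification you'd script in an HDL testbench.
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            s, cout = full_adder(a, b, cin)
            assert 2 * cout + s == a + b + cin
print("full adder verified")
```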
Neither: Amstrad PCW.
"I have some personal experience of this, and have some speculations on what might be going on."
I'd be interested in those speculations. Privately, if you like. my.name@gmail.com will find me…
Don't underestimate how early it begins (studies of how parents touch/look at babies shows different treatment based on perceived gender all the way back to when they're first presented after birth), or peer-group enforcement: masculine behaviour is enforced by peer ridicule of non-masculine behaviour in males, for example. And there are second-order effects. For example, pushback against feminism can come from women who are either strongly invested in the current social hierarchy (and see it as implying loss of relative advantage for themselves), or because of cognitive dissonance (acting to address issues of disempowerment require first admitting that one has been systematically disempowered, which can be unpleasant).
Then there are other really odd interactions with different social hierarchies. Consider how expressions of white western racism against muslims in the west often take the shape of attacks by men against hijab-wearing women, or legislation to enforce female dress codes by "banning the burkha" which are justified in terms of feminism. White supremacists have a strong anti-feminist streak and are obsessively preoccupied with reproductive enslavement of white women. 1930s/40s Nazis were obsessed with the fear of Jewish men inter-marrying with aryan women. And so on.
I am not doing so. My speculations are largely based on the observations of girl children of households with two academic parents, including two daughters of my own with very different characters, especially before the age of 3.
Verilog!
I spent a decade working alongside a bunch of hardware developers. It was interesting to hear one mention that for a while, the choice between the two languages was geographically biased (i.e. Europeans tended to use one, Americans the other). I found both languages frustrating, because they didn’t really do abstraction very well. It was like a trip back to the 80s in language style...
And in an ironic turn, they’re now moving away from SystemVerilog towards coding up their designs in C/C++ and using high-level synthesis tools to achieve better QoR...
Yay ;)
"My view remains that, if you want to solve a problem, you first have to understand it."
If by "understand it" you mean "understand the cause of the problem", then I think that's false. Both in the general case and in this specific one.
General Case:
We didn't need to understand the causal link between tobacco and cancer, or between lead in petrol and paint and brain development, in order to act to solve the problems. Despite the tobacco and oil companies' best attempts to claim that we did need to. Some global warming denialist science retreats to the same defense: "Dynamic feedbacks! Very complex system! Can't possibly suggest simple solutions if we don't understand the complex system!"
We don't understand how cerebral asymmetry and left-handedness work. (Well, we didn't 20 years ago when I read the research.) That didn't stop us providing the ability to easily use mice left-handed, or providing young kids with left-handed scissors at school (or ones used easily by either hand - I don't get the trend to "handed-ness" in scissors).
Obviously work done in complete ignorance is likely to fail. And obviously the more we know the more likely we are to have better-focussed solutions. And not all problems are equal - engineering issues tend to have simpler causal relationships.
But typically, once one steps outside the realm of engineering, solutions don't require full understanding of the cause of the problem.
Specific Case:
I'm sure you are aware that there are things that can be done by every teacher, every senior engineer running discussions, and every manager: to work to ensure that women speaking up aren't interrupted, that their ideas are treated with respect, and to overcome the bias pretty much everyone in our culture has toward asking men questions more than women without noticing we're doing so, etc. Not to mention a zillion details of modelling with young kids.
Maybe that won't "solve the problem" in some perfect sense. But given sufficient time and effort it seems very likely to go a very long way towards it.
"pushback against feminism can come from women who are either strongly invested in the current social hierarchy (and see it as implying loss of relative advantage for themselves)"

The Madwoman from Grantham being a very good example of this. Which reminds me, given the hooh-hah over "inappropriate" behaviour by MPs irrespective of party: 1: How did slimy Cecil Piock get away with it, even then? & 2: Even so, there's a lot of fuss; just slap 'em round the chops with a wet fish (or equivalent), works almost every time ....
Bugger Slimy Cecil PRICK
Looked at that - I remember one of the amazing sectional drawings in "The Eagle" of said aircraft ... But I also noticed that there was an underside view of a Mk1 (?) EE Lightning a couple of pages down. Now there was a scary (to its opponents) aircraft.
Modern renewables are high-apex engineering. Enormous supply nets required. If you want to skip coal, the most sensible path is something like "Hydro + demographic transition early so you never have to use anything else" (copper IUDs can be made and safely inserted/removed at any tech level if you know how, so contraception is really portable) or Hydro -> nuclear, because reactors can be built on top of a much smaller industrial eco-system.
Not just to its opponents, going by some of the pilots' reminiscences you can find about it...
But are the enormous supply nets "required"? Or is it just that we've only started getting into renewables in a big way after the obsession with doing everything in that idiotic fashion has become thoroughly entrenched, and this makes it difficult to imagine doing it differently?
A wind generator, after all, is the same ancient technology as a hydro station; the bit that isn't ancient which is common to both is still well over a century old. The main constituent of a solar panel occurs in quantity on innumerable beaches. A tidal power station is simply a hydro station using salt water, and the idea is similarly ancient. Solar powered steam engines are nearly as old as coal powered ones. There is no actual need to buy this stuff from China, even if we have got the habit of doing that in this timeline.
It seems to me the biggest obstacle in "skipping coal" (or oil) is much the same as in getting off it: batteries. Renewables are intermittent, and therefore require either massive storage or massive excess generation capacity; mobile applications call for high energy density storage. (Probably the most significant such application is in agricultural and related areas; railways and local goods distribution can get by using lead-acid or NiFe, but a battery-powered tractor is kind of tricky, and a battery-powered chainsaw is probably only going to be usable by Schwarzenegger.)
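To put rough (and entirely made-up, illustrative) numbers on the overcapacity-versus-storage trade-off:

```python
# Back-of-envelope sizing for an intermittent-renewables grid.
avg_demand_mw   = 100        # steady load to be met
capacity_factor = 0.30       # typical-ish onshore wind; an assumption
calm_days       = 3          # storage sized to ride through a lull

# Nameplate capacity needed so that *average* output covers demand.
nameplate_mw = avg_demand_mw / capacity_factor

# Storage needed to carry the full load through a windless spell.
storage_mwh = avg_demand_mw * 24 * calm_days

print(f"nameplate needed: {nameplate_mw:.0f} MW")        # 333 MW
print(f"storage for {calm_days} calm days: {storage_mwh:.0f} MWh")  # 7200 MWh
```

Even with toy numbers, the shape of the problem is clear: a 3x overbuild plus three days of bulk storage, which is why the comment singles out batteries as the sticking point.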
There's also the bootstrap aspect, in that you need to start by reducing large amounts of iron and copper oxides, which pretty much means you have to have carbon.
Still, though, the Commonwealth is not really in a skipping-coal kind of situation; rather, they are already happily using it. But compared to us, they have far more remaining planetary capacity to absorb the consequences, and are in a much better position to take avoiding action before the situation becomes serious.
"I don't get the trend to "handed-ness" in scissors"
I don't get the trend to chirality in mice either. I'd been happily using both scissors and mice for years, with my left hand, without any difficulty, before I discovered that left-handed versions even existed. I still don't recall ever having seen a left-handed version of either.
Come to that, I'm not sure what a left-handed mouse even is. Chiral trackballs, certainly; I much prefer a trackball to a mouse, and it has long pissed me off that once you cross out all the trackballs that are emphatically shaped to fit only the right hand, there's not much choice left. But mice - they are quite the opposite. Practically all mice are bilaterally symmetrical.
(As for "left-handed" mouse configuration settings, as far as I can see all they do is swap all the buttons around, which isn't quite as bad as turning the steering wheel anticlockwise to go right but isn't far off it for daftness. Maybe writing mouse configuration applets is the programming shop equivalent of sending the apprentice for a left-handed screwdriver.)
You want to see 'trivialised'? Try being a male rape survivor.
Somewhat less daft is that I've sat in the doorway with my feet on the skids while a Westland Scout was flying us across Dartmoor; it was a wide back door, and we had two of us on each side. Safety was provided by a strap across the doorway, that you hung your arms over...
I've done that a few times in a UH-1; just looped a rope through one of the cargo D-rings on the floor & ran it around your arm.
For a while Special Forces at Fort Bragg were doing HALO jumps from the AH-64. They'd lay on their stomachs draped over the wing & slid off the back when they were ready to go.
... until the brass caught wind of what they were doing and put a stop to it.
Read a book a friend loaned me? Gave me? a bunch of years ago, autobiographical, written by a slick pilot about 'Nam.
You want GENUINELY certifiable?
Would that have been Robert Mason's "Chickenhawk"?
I'm just-about right-handed (I could write with either hand at one point, but I shudder to think what it would look like now). But I use a computer mouse left-handed, with the buttons in the normal configuration. Left-handed for preference with a sword, right-handed (because, like 95+% of the population, I'm right-eyed) with a bow or rifle. On the only two occasions I've used a "pistol" I found that either left-hand or double grip was better than right. Um.
Yes, well, the current Westminster hothouse furore seems to be almost entirely about men abusing their power over women, but there are other cases, of course. Incidentally, I've seen this sort of febrile "moral panic" before - 1963-4 "Profumo" (etc), but that was almost-exclusively confined to the tories & contributed to Wislon winning the next election. THIS time, it appears that the scandal/panic is more widespread.
I forgot to include the necessary reference sorry about that.
Wind turbine installations on land require heavy-lift mobile cranes (50 tonne to 200 tonne dead lift to 50m plus heights over base) which are not a trivial item to design, construct and operate, never mind the road construction needed to get them to site in remote areas. Sea-based turbine installations require ships with similar crane capabilities, again not trivial to develop and build.
Silicon solar cells require a lot of quite sophisticated plant to make the silicon substrate, even amorphous panels which are less efficient than crystalline. The chemical soup that goes on top is not trivial to source given the required purity and consistency for large-scale production of good-quality solar panels.
Time was, during the second generation of nuclear plant construction in the early 1970s that pretty much everything was constructed on-site with a lot of welding of "small" segments (5 tonne to 20 tonne) to assemble larger structures such as the reactor vessel and steam generators. Nowadays it is assumed that all large reactor components up to several hundred tonnes in weight will be built and tested in a foundry or factory and then transported to site for installation. These parts are even bigger than the second-generation units since the finished reactors will typically produce over 1GWe compared to the 2nd gen plants that are in the 500MWe range.
It took the development of quite complex transportation options to make moving a 250-tonne steam generator by sea, rail and road to site as well as some very large cranes to put them in place in the reactor building.
Which reminds me ... What's the population-load & tech level of the rest of the planet in the Commonwealth t-line? Same as ours? I hate to say it, but a failing of the "Clan" series & this one (IMHO) is that "nowhere exists" outside the para-USA of the series, in the developed world, at any rate... In other words the usual failing of the USA to realise that there are other places with other customs & history. Um. Or am I missing something?
Wind turbine installations on land require heavy-lift mobile cranes (50 tonne to 200 tonne dead lift to 50m plus heights over base) which are not a trivial item to design, construct and operate,
An observation: the railways developed heavy mobile cranes back in the 19th century (because derailments). One might speculate about a wind farm designed around something not too unlike a railway switchyard — lots of parallel tracks running through the farm area, spaced far enough apart that the rows of turbines between the tracks don't interfere too much. Crane access to the turbines made relatively easy. It'd take a bit more ground work than the regular service roads, but the mobile cranes could be decidedly lower tech.
Look into higher end or gaming mice. Basically as soon as you add extra buttons like thumb buttons, which are very very useful, you rapidly start to optimise the shape of the mouse to fit a hand, and it is always optimised for a right hand.
Some of the newer companies like Razer are releasing dedicated left hand mice, but then you run into the second problem. There are two types of left handed people - those that swap button orientation so the index finger is primary click, and those that retain button orientation so that primary click is on the left and done with the middle finger. So that reduces the market even further, since many mice have primary button config locked at a hardware level - you can override it but many apps will ignore the override and get data direct from the mouse.
I used to be ambidextrous with mice for gaming, but there are very few good symmetrical mice with side buttons available, even now.
They don't need modern renewables. Much of the tech for renewables was invented and made in the early 20th century, if not the 19th. Sure, it wasn't as perfect and efficient as the 21st century equipment, but as the various hydro schemes in Scotland from the 1930's and 40's show, you can do all you need with relatively primitive equipment and skilled labour.
The "big thing" is telescopic cranes, ones that can reach a hundred metres and more vertically with a 100-tonne suspended load. Train-mounted cranes for handling derailments are short and squat and don't lift much over 50 tonnes anyway (steam locomotives and waggons were quite small for a long time).
The alternative for big wind turbines in the 5MW dataplate class (producing on average 1.5MW) is to install a tower crane base separately for a high-lift crane the way they do for high-rise buildings and that's expensive if you need one for each turbine tower since they're widely separated to avoid wind shadow (typically 100 metres and more).
It's interesting to see pictures of the Vogtle and Summer reactor builds which require short but heavy cranes for lifting 500-tonne reactor parts into place -- the biggest crane on each site runs on a circular track between the two reactors rather than having a big semi-mobile crane or two fixed cranes, one for each reactor.
Hydro is the obvious starting point for any plan for cross-time uplift, because, well, you could probably manage to build dams with social organization equal to the Ancient Egyptians or any of the more functional empires and a handful of engineers, and that gets you cheap power. If your population is low enough that you can meet demand from that alone, then that is it, you are done with the power-supply side of the equation. But if it is not, well.. People and nations tried very damn hard to make solar/wind work as a grid supply, and mostly, they still just do not work. They are certainly not going to work out if, for example, you are trying to expand abundant power from the alt-earth equivalent of Norway, without a global economy measured in the billions of industrialized citizens. Wind turbine blades are enormously advanced composite-materials engineering, let alone what goes into the gearboxes, and the economic production of worthwhile solar cells requires enormous volume and high mastery of industrial chemistry. Copying a heavy water reactor requires you to be able to machine steel to decent tolerances.
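The hydro arithmetic is at least friendly to low-tech builders. A rough sizing sketch (all figures illustrative, not from the thread): power is water density times gravity times flow times head times turbine efficiency.

```python
# Rough hydro-station output: P = rho * g * Q * h * efficiency
rho, g = 1000.0, 9.81      # water density (kg/m^3), gravity (m/s^2)
flow_m3s   = 200.0         # river flow through the turbines (m^3/s)
head_m     = 50.0          # height the water falls
efficiency = 0.85          # well within reach of early-20th-century turbines

power_w = rho * g * flow_m3s * head_m * efficiency
print(f"{power_w / 1e6:.0f} MW")   # 83 MW of electrical output
```

A modest dam on a decent river yields tens of megawatts with no composites, no semiconductors, and no enriched fuel, which is the point being made.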
Admittedly off-topic, but... Greg is after an LED replacement for a 100W bayonet bulb. So was I. But the hunt is over: try LED Hut's (ledhut.co.uk) 11W B22 (Bayonet) Filament LED Bulb - Frosted (3000K). I've now replaced all my old 100W bulbs with these.
You're going to have to expand on 'tried hard to make solar/ wind work as a grid supply but it doesn't'* because here in Scotland we're getting lots of lovely grid wind power from lots of wind turbines.
*paraphrased
That's funny, I've gotten my LED bulbs from a combination of B&Q and Tesco, obviously being careful as to the colour temperature and CRI claims, and it includes at least one 100W equivalent.
Hence the "mostly". With a global, highly-developed-world effort we have gotten to the point where, if you happen to be geographically blessed with an unusually good resource and a means of storage, it is workable. That is not going to be very helpful to the earth-3 equivalent of Poland, to pick a country with the potential to have a whole lot of people in it but bugger-all renewable energy flows to harvest.
Okay, that makes more sense. So you are saying some should go straight out for nuclear, which would make sense if they don't have much renewables.
It would also make sense to impose a strict building insulation code in Earth 3, given the amount of energy that is used for space heating.
GOT IT - thanks.
My experience has been that every off-brand and own-brand LED bulb I have bought in the past several years has failed drastically short of its claimed life. I don't think any have lasted more than 5000 hours and most fall short of even that. So I'm now buying Philips because they might actually last the 15000 hours+ it says on the pack and they will be around to honour the warranty if they don't. (I have had a refund from Amazon for off-brand bulbs that failed inside a year when their claimed 24/7 life was more than three years).
Monocultures can be very vulnerable to pathogens.
Actually, the railways usually solved that sort of problem with a single long feeder line, with a run-round loop at the end, and lots of short stub sidings for the individual "outlets" (turbines in this case). Let's see if the formatting will allow it:
\ \ \ \ \ \ \ \ \ \ \ \ ______/
That solves the crane. It does nothing for the fact that the blades of a modern turbine are fifty metres long, made of space-age composites to aeronautical tolerances, all mounted on monstrous gearboxes with self-lubricating gears taller than you are. They are some of the highest expressions of the art of mechanical engineering ever made, and if you try to start production of them in a place without a long-ass tradition of industry, it won't go all that well. China has major problems building these things to spec, despite the fact that the industry as a whole does not even bother to patent things.
Why bother with high tech wind farms? As has been pointed out, water can be done far more low tech and will be more reliable. Somewhere in deepest Derbyshire, I know of an old water mill where the wheel has been hooked up to a microgenerator (made of old tractor bits, with a control board done by the farmer's mate at Nottingham uni). It's been sitting there feeding into the grid for about two decades now.
You can also get a lot of solar energy by using water-heating panels - one of my childhood memories is of Dad making some of these from old plumbing and a scrap water tank. Might only raise the temperature by ten degrees, but use that as feedwater for your hot water system/boiler and it will add up.
That's like saying "All cars currently existing are Ferraris, therefore the only kind of car it is possible to build is a Ferrari", but while you are arguing about how to build Ferraris, Greg has got in his Land Rover and is already half way there :)
Ha, yes, I remember making them from plastic bags when I was a kid - a transparent bag for the front and a black one for the back, welded together in a zig-zag pattern. (Using solar energy for the welding, too, by means of a magnifying glass.) Worked, but what I didn't manage to do was make a supporting enclosure that could handle the hydrostatic effects.
... That comparison would hold if Ferraris got 10 times the miles per gallon of a normal car.
Windmills are not marvels of engineering because they are penile extenders for the engineers designing them or the workers building them (although, I have met some of these people, and they do take a whole lot of pride in their work). They are marvels because all that engineering brings down the cost per kilowatt-hour of electricity produced, and extends how long they last, both between necessary maintenance and before the end of useful functionality. More primitive designs are massively less economically viable - we're talking an order of magnitude here. And that matters if you are building out infrastructure.
Re: Gender differences ... 'My view ... if you want to solve a problem, you first have to understand it.'
Understand it like quantum mechanics where because the math looks clean and the outcome matches predictions, it's okay? Or, understand it down in your bones?
Based on the bits of research I've read about gender differences as well as personal experience, I'd say that we're nowhere near either. Part of this is due to cultural and social factors including varying degrees of inertia, part to not even being aware of what the clues/signs are that we should be looking at, part to teasing out individual differences (genetic, epigenetic, developmental, environmental, etc.) against varied cultural backgrounds, etc.
IMO, the biggest stumbling block is assuming that there may be only one, two or six factors - ever, and for all time - whereas it could be hundreds at different strengths (including zero), plus permutations and combinations. So, yeah - I'm completely on board for more research on this.
That said - regardless of what gender differences there may be, I think that there is one overarching notion that could be agreed upon: they're all humans. And as humans, each has a right to self-respect/autonomy, health, education, etc. according to their individual needs and abilities provided these needs and abilities do not harm other humans. (May sound corny but I really mean and believe it.) In some ways 'human' vs. 'individual' is also a scale problem therefore the need to look at different factors operating at different levels/scales.
BTW, I fully expect individual differences, therefore do not expect myself (or anyone) to be able to predict someone else's actions or reactions 100% across all situations. Seriously - I'd have to be able to monitor their every breath awake or asleep (dreams) in order to get all the requisite data to make such a prediction and only if they never interacted with anything/anyone ever. Doubt I could even make such a prediction about myself and I've been living inside my head all my life.
AI as a data gathering and analysis tool could come in handy in identifying factors that we're currently unaware of provided the programmers/developers of such AI allow for greater scope/breadth in terms of type of data gathered ... plus who knows what other yet to be determined things.
Re: Energy
The research is growing so fast across an increasing number of forms of energy production and techniques that, so long as research funding isn't cut off, existing utilities don't kill these new ventures, and people get off the 'but we must have only one universal energy form' dogma bandwagon, most of the planet should be able to source its own energy locally and affordably.
For example, here's another new energy source: water evaporation.
https://www.sciencedaily.com/releases/2017/09/170926125154.htm
Excerpt:
'In the first evaluation of evaporation as a renewable energy source, researchers at Columbia University find that U.S. lakes and reservoirs could generate 325 gigawatts of power, nearly 70 percent of what the United States currently produces.
Though still limited to experiments in the lab, evaporation-harvested power could in principle be made on demand, day or night, overcoming the intermittency problems plaguing solar and wind energy. The researchers' calculations are outlined in the Sept. issue of Nature Communications.'
Meanwhile, already established alternate energy source production techniques continue to be improved:
https://www.sciencedaily.com/releases/2017/08/170828124534.htm
Excerpt:
'Researchers have developed a simple, low-cost, and environmentally sound method for fabricating a highly-efficient selective solar absorber (SSA) to convert sunlight into heat for energy-related applications. The team used a 'dip and dry' approach whereby strips coated with a reactive metal are dipped into a solution containing ions of a less reactive metal to create plasmonic-nanoparticle-coated foils that perform as well or better than existing SSAs, regardless of the sun's angle.'
Geometry
Can't find the paper - it's about PV design and using improved geometry to develop a surface texture that bounces rays back and forth, so that more of the rays actually touch more surface area, thereby extracting more energy. (Sorta like origami.)
Here's an older (2012) related study:
https://phys.org/news/2012-04-wrinkles-boost-power-solar-panels.html
Excerpt:
'Jong Bok Kim, a postdoctoral researcher in chemical and biological engineering and the paper's lead author, explained in the Nature Photonics paper that the folds on the surface of the panels channel light waves through the material in much the same way that canals guide water through farmland. By curving the light through the material, the researchers essentially trap the light inside the photovoltaic material for a longer time, which leads to greater absorption of light and generation of energy.
"I expected that it would increase the photocurrent because the folded surface is quite similar to the morphology of leaves, a natural system with high light harvesting efficiency," said Kim, a postdoctoral researcher in chemical and biological engineering. "However, when I actually constructed solar cells on top of the folded surface, its effect was better than my expectations."
Although the technique results in an overall increase in efficiency, the results were particularly significant at the red side of the light spectrum, which has the longest wavelengths of visible light. The efficiency of conventional solar panels drops off radically as light's wavelength increases, and almost no light is absorbed as the spectrum approaches the infrared. But the folding technique increased absorption at this end of the spectrum by roughly 600 percent, the researchers found.'
"That's funny, I've gotten my LED bulbs from a combination of B&Q and Tesco"
Well, yes, I'd tried those. No good if you need the same degree of light omni-directionality as normal incandescent ones. But if they work for you...
The average tenure of a UK Prime Minister isn't ten years; it is just under five and a half years. Since 4th April 1721 (Walpole appointed First Lord of the Treasury for the second time) there have been 54 Prime Ministers by the standard count in 296.59 years. This works out at 5.49 years each. The actual figure is slightly less due to occasional vacancies and a couple of abortive appointments.
I think the fundamental problem is embedded in the word "ambidextrous." If it read "ambisinistrous," we'd have a lot fewer problems.
Here's the deal: if you're right-handed, you grow up in a world that's biased in your favor. If your right hand is disabled, you're disabled. If you're left-handed (like me and many others), you grow up in a world that's biased against you, and you learn to do a variety of disparate things either by adapting right-handed tech to your left hand (me with mice) or your "off-hand" (I use scissors right-handed). Left handers are trained by the civilizational bias against us to be more versatile and less handed than are right handers. Like Leonardo, I can even write right-handed a little, although it's easier for me to do so backwards.
I suspect the reason we get things like "left-handed scissors" is that right-handers think it's a good way to reverse the bias and exploit an untapped market of left-handers. While they're appreciated, they're seldom needed, simply because most of us south-paws are used to coping with right-handed tools. While there are certainly cases where left-handed tools are necessary (for me, it was a fencing epee), for the most part they aren't.
This is what I meant about the problem with the word "ambidextrous." It has the sense of having two RIGHT hands. The thing is, right-handers ("north paws?") are innately more handed than we are. To them, a functional left hand is weird. If you want to understand how to use both hands equally, you really need to have two equally functional left hands--ambisinistrous--because that's the way lefties are forced to live. However, "ambisinistrous" isn't even a word, which tells you how ignorant the world is.
Probably the most significant such application is in agricultural and related areas; railways and local goods distribution can get by using lead-acid or NiFe, but a battery-powered tractor is kind of tricky, and a battery-powered chainsaw is probably only going to be usable by Schwarzenegger.
According to Google, you can get a battery-powered chainsaw at your local big box store (I see three for general sale, by three different manufacturers). They're not as powerful as a gas saw, but they do work.
Similarly, John Deere prototyped an electric tractor last year. We'll see how it works out.
I think what I'm saying is that while I am also left-handed, I've never really noticed the bias of itself. There's some bladed cutting tool which has an asymmetrically ground edge to the blade and tends to wander off if you use it left-handed, but I can't remember what it actually is or what you use it for; lots of power drills have the trigger lock button positioned so that if you're using it left-handed and the bit jams, the sudden twist of the drill body shoves the button into your hand and locks the trigger precisely when you don't want it locked; but apart from that I can't think of anything off the top of my head. What I do notice the bias by is other people complaining about it in ways I don't understand - scissors are the usual example. People say something weird happens when you use scissors left-handed but it's never happened to me; they just work (or if they don't, it's because they're knackered anyway).
The bias I do notice is that against the left eye, because my right eye doesn't work. Many things you have to look through have at the least an eye-cup shaped to fit a right eye, although you can usually squash it enough for it not to matter. The only time I've ever fired a rifle it was a pain in the arse because using my left eye put the bolt on the wrong side, and my camera has various buttons on the rear for no reason, which I can't use because my face gets in the way (although the thing is a complete ergonomic clusterfuck even without that).
Mentioning the camera has reminded me of another factor which also goes unconsidered, even though it is rather more common than monocularism - reduced accommodation range. SLR viewfinders too often assume that the unaided eye can focus at infinity. Mine can't, so with my specs off the image is always blurred, and with them on I can't get my eye close enough to the hole. Which in turn means I can't use manual focussing when the autofocus can't cope...
I bet the tractor doesn't use lead-acid batteries though :) And I can't see the chainsaws being much more than toys. A chainsaw engine usually puts out around 3kW or so; supplying that from a battery you can comfortably carry isn't possible for very long, and if you reduce the power to compensate it'll be too feeble to do any good.
In the context of a civilisation mechanising while trying to avoid fossil fuels, though, you probably won't even get that much, because all the "easy" battery chemistries are heavy. The problem is you have to start mechanising agriculture first so everyone doesn't have to be a farmer, so you're stuck with simple versions of the technologies until you've got going. (Or you can use sailing ships to pinch food off some poor bugger in a distant land, but I'm assuming you're not doing that either.)
I have a vague memory of an old black-and-white photo of an electric chainsaw in use; two blokes wielding it, and a cable to a big battery sitting on the floor which would probably need at least another two to carry it. I think it is probably a false memory, and one reason I think this is that I can't imagine why anyone would bother to make such a thing when IC engines have been invented.
A heavy-battery tractor as we understand it would probably just get irretrievably bogged in the mud, but you could probably manage an electric version of the old steam ploughing engines, that kept to the edges of the field and hauled a plough back and forth across it with a big winch. Not as versatile, but still a lot better than not having it at all.
SLR viewfinders too often assume that the unaided eye can focus at infinity. Mine can't.
Are you sure? My Canon 5D II and 6D both have a little knob, just to above and to the right of the eye hole, marked with '-' and '+'. When I go out shooting I wear my reading specs so I can view the screen, which means of course that I can't focus on infinity. I set the little knob a few clicks to the - direction. My sight is better close with my specs on than it is distance with no specs (I don't have distance glasses), so I can actually judge focus a little better set to '-' with specs than I can set to zero without specs. (though the screens on autofocus cameras aren't designed to allow you to judge focus)
Interestingly, even as a right hander, I find the total lack of left handed SLRs completely baffling. No single SLR model has anything like 10% of the market share. So a good left handed model should instantly be the best selling single model of all (until everyone else copied the idea). Make it the entry level body and you've instantly locked 10% of new customers to your system of lenses. Seems like a total no-brainer.
And I can't see the chainsaws being much more than toys. A chainsaw engine usually puts out around 3kW or so; supplying that from a battery you can comfortably carry isn't possible for very long, and if you reduce the power to compensate it'll be too feeble to do any good.
I suspect the market for them is where sound matters a lot - in an urban area or somewhere similar. But I suspect a lot of them are bought by people who didn't think the intended use through. I just bought a replacement for my worn-out 2-cycle gas one. It's small - only 14". But it will work all day if I buy the gas and oil ahead of time. Plus I have a 12V-operated sharpener. You really can't buy a full day's worth of battery charge for such a saw for any practical amount of money. Which, when the power is off for 1 to 7 days after a hurricane, might be a point to consider.
But for cutting up small stuff I can see not having to deal with 2 cycle, oil, periodic cleaning, where to store it, etc...
But then again, for those situations I get out an electric reciprocating saw with a large-tooth blade. I have both battery and corded ones, and have used the corded one to cut out a large quantity of roots trying to undermine my house. And a chainsaw blade which has hit dirt is basically a high-powered butter knife.
There was a "bridge" 35mm film camera produced which had a left-handed variant, the Yashica Samurai. It was designed for one-handed operation, a bit like a camcorder hence the two versions. Not an SLR though.
But for cutting up small stuff I can see not having to deal with 2 cycle, oil, periodic cleaning, where to store it, etc...
I concur; plenty of people live in suburbs or similar areas and only need a chainsaw a few times a year for relatively brief periods. If the saw can run on either battery or wall power, many users will be happy with a flexible light duty tool. It's useless for professional loggers of course, but they're not the intended market.
It's basically the same comparison we've made between electric commuter cars and large delivery trucks.
Interestingly, even as a right hander, I find the total lack of left handed SLRs completely baffling. ... Seems like a total no-brainer.
You'd think. A few years back I was working for a well known multitool manufacturer. I got to looking at one of the products, then paged through the catalog, and took another look at the tool in my hand, and couldn't figure out a good reason why various bits couldn't be swapped around for a mirror-imaged product. So when I had a spare minute I went to a guy who'd been there a while, who happened to be left-handed himself, and brought up the idea. Before I even finished asking the question I could see that he'd been over this before. Yes, it should be mechanically possible - but nobody had gotten marketing to offer one, and it wasn't clear to the rest of us why such a thing couldn't be sold.
For mechanizing agriculture, you go with ammonia, and the electrosynthesis path. Farmers have to be able to handle the stuff safely regardless, so running the tractors off it is not an issue.
The good news is that most target rifles in my version of the sport come with a left-handed version; if you look at any pictures of the Olympics, etc, there’s normally someone facing a different direction to most. There are even sight adapters that allow the “opposite” eye to be used for aiming. I’ve always been aware of the latter as a coach, because my mother is left-eyed / right-handed... most UK service weapons are right-hand / right-eye only, though. Apocryphal note - the sniper in “Saving Private Ryan” aims and fires left-handed.
Another interesting comment from a friend who was an international fencer, was that nearly half of the top-class fencers are left-handed. They have an advantage in that they’re used to fighting right-handers, the right-handers are less likely to have a left-hander to train against (she’s pretty ambidextrous; injured her right wrist, changed to fighting left-handed, got back into the Scottish team).
There’s a similar line of reasoning in Judo; one of the local clubs teaches its competitive kids to work left-handed. (I actually found that some throws work better for me left-handed than right, to the despair of our instructor)
For applications where electricity just won't do, you could easily run alcohol or other biofuel (e.g. chip fat). If these fuels are planned properly, using waste from agriculture, they're not only cheap but eco friendly too. Granted, you might need to mix in some crude oil products, but you'll probably be using that for lubrication anyway.
There's surely an easier way round electric car charging times (especially if using low tech batteries). Design them all around one or two standard battery packs, and have swapping stations where you'd have petrol stations. Drive in, swap packs, pay a fee to the company who will charge, service and if need be scrap and replace the packs.
References like Java, Python, Ruby or Swift.
you could easily...
A phrase that typically makes me run away.
* Waves hand *
Perl had references before Java, Ruby, or Swift existed (at least in public), and possibly prior to Python.
I've suggested the same thing a couple times and nobody seems interested. It does seem obvious, however. The problem, of course, is that you end up with multiple batteries for each car, probably at a 2:1 or 3:1 ratio.
Most small-bore clubs had a left-handed BSA Martini in the cupboard for left-hand shooters, or at least something with a neutral stock. Trying to shoot a regular RH-stocked target rifle left-handed left a crease in my cheek.
There were standard drills and competitive shoots in the Practical Pistol discipline that require "weak hand" shooting i.e. using the hand you don't normally use to shoot with. Some folks coped quite well, some didn't. The controls on a typical pistol (safety, slide release etc.) are right-handed and require some finger-bending manipulation to activate in a hurry if used by a left-hander (like myself). Most serious shooters had "space gun" conversion kits that made the controls double-sided just to deal with these kinds of courses of fire.
The "Courier" involved a move-and-shoot sequence with a small briefcase containing a couple of half-bricks handcuffed to your dominant wrist. That was... fun.
Non pointery references have been around for ages before the likes of perl, where in this case "pointery" means you get to do pointer arithmetic.
Tony Hoare apologised for inventing null, but I strongly suspect that if he had avoided it then someone else would have done the deed.
Yes, when I fenced left handed against another left-hander, it was a mess. Here's the little secret: lefthanders don't face other lefthanders any more often than righthanders do, and without that practice, we don't know how to do it either.
Also, as Scott pointed out, there's little point making left-handed multitools simply because the tools last for a long time and only 5-10% of the market might be interested in buying a left-handed tool, so why bother? I can't find it now, but I've seen a video of someone using a machine shop to turn a multitool left-handed. There's certainly an aftermarket for that. CRKT made a symmetrical multitool (the zilla) a few years back. It could be used by lefties, but of course it has been discontinued.
I also suspect that someone could start a left-handed tool company producing left-handed and symmetrical tools for lefties and for survivalists. As for the latter, I've long carried a (now discontinued) knife marketed as a survival knife. It can be opened one handed with either hand, the point being that if one arm is disabled, you can still use the knife. Not that this forum is crawling with entrepreneurs, but if you know someone who wants to make tools and attract a loyal following, feel free to pass the idea on.
...References...
Which (under the hood) are just pointers, which the compiler insists you point at something (i.e. "no nullptr") when you create them. Once you've seen a null reference throw an exception, you become less idealistic about such things...
http://pawlan.com/monica/articles/refobjs/
Forgive me for not seeing much of a difference between the complexities imposed by the need in Java to indicate Soft, Weak, and Phantom references and to handle them correctly (to prevent the Garbage Collector from deleting the referent and leaving you with a null reference); and those surrounding std::shared_ptr, std::weak_ptr, std::unique_ptr, etc., etc.
AIUI, Perl uses reference counting in its garbage collector - in other words, the same mechanism as used by the C++ STL smart pointers. Except in C++ it's explicit, and the point of object destruction is deterministic.
Don't get me wrong, I get twitchy every time I see raw pointers in C++ code; but they have their place and their uses (unfortunately, that place is "throughout the Windows API"; I've moved from projects where we delivered to multiple OS targets and abstracted via Boost/STL, to one which is Windows-only and a decade old, with lots of crufty old uncommented, undocumented OS interface code for the I/O stuff).
Ooops, apologies to all the Perl Monks out there - I just discovered that Perl garbage collection is deterministic...
(That's what I get for being unskilled in the industry's least-popular programming language)
Yes, one of them is a bit poor with darker areas and a hotspot, but the other two are fine. I think part of the issue with LED bulbs, apart from poor buying decisions by companies selling them and a general desire to sell the cheapest crap possible, is that consumers have one bad experience and then swear off them forever. As with CFLs, I've a vague memory of an argument on here maybe a decade ago, with someone like Greg saying CFLs were totally rubbish and useless, and myself and others saying hang on, they've come on a bit since you tried them in the 1990s; you should try them again.
CFLs were fine in the 90s too, as long as you:
1) got them from QD or the pound shop or somewhere else cheap. Cheap ones would generally achieve full brightness immediately; it was mostly the more expensive ones that took five minutes to warm up properly.
2) ignored the manufacturer's claims for what wattage incandescent they were equivalent to and went by the evidence of your eyes instead. People who continued to buy 11W CFLs to replace 60W incandescents because they thought that what it said on the box could somehow override the evidence of their eyes that you really needed the next size up have only themselves to blame.
Since nearly all my lights were generally only switched on once a day, I also used to modify CFLs by shorting the filaments with a blob of solder, so they were heated only by bombardment and not resistively as well. It meant they would strike with the filaments dead cold, which isn't ideal, but on the other hand the filaments would be less stressed in continuous operation, so with the kind of use I was giving them they lasted longer overall.
By far the best for durability were the original Philips ones from the 80s with an inductive ballast. These would keep going for years, until half the phosphor had fallen off the inside of the tube. I kept the ballast out of one of these and used it with tubes from pound-shop CFLs, which combination gave similar durability but without the long wait for it to warm up.
What I think MikeA was getting at re B&Q LEDs is the same thing that gets on my tits about them: they seem to be designed for people who think you should be staring at the light bulb itself rather than using it to see other things by, and who freak out if it isn't the same shape as an incandescent usually is. So you get a small number of high-powered and fairly directional emitters mounted inside a casing which is shaped like an incandescent, but unlike an incandescent only illuminates things underneath it and not things round the sides.
When I say "LED bulbs are great" what I mean by an LED bulb is the type with a large number of low-powered emitters mounted on the sides and end of a roughly cylindrical substrate, and it seems that places like Amazon and ebay are the only sources for this type. b22 corn bulb should be a useful search term.
Early days yet, but I think my durability mod for these is going to be to buy one with 3/2 times as many LEDs as the manufacturers think are needed for a given level of illumination, and then replace the ballast capacitor with one 2/3 of the value of the original. Much of the research effort for LEDs has shifted from making them more efficient to making them keep working at high temperatures - Tj of 180°C or something ridiculous. This means that, almost uniquely for a semiconductor device, they will almost carry on working right up to the point where the smoke comes out, instead of conking out long before; and the manufacturers, who have an ingrained aversion to simply making the bulb big enough to have an adequate surface area for heat dissipation, seem rather too keen to push this capability further than it can really go.
Usefully C++ actually had non-nullable pointers as an option. Because C++ is all about the options. I'm also slightly in love with Idtypes.idl which gives me a non-converting integer subtype (so "phonenumber = planduration_months" does not compile).
Sadly for me I started my current project just before Rust was capable of doing what I needed, so I have 3-4 years worth of C++ experience after working in slightly more competent languages for the previous 20-odd. I have spent no small amount of time finding the wee utility classes/templates that make C++ slightly less ugly to use. I find the "C++ Frequently Questioned Answers" useful whenever I discover a new cool trick ... if it's in the FQA it should be used with caution (and thank you to MySQL, whose C++ library throws things that are not exceptions. Of course you can do that, it's C++!).
IMO the Commonwealth would do well to largely ignore all the primitive computing stuff and go straight to 32 bit low power ubiquitous microprocessors. I think focussing more on the problems we are having now makes sense. It's like many African countries going straight from "one telegraph line" to "cellphones everywhere" with no intervening rollout of landlines.
They could quite reasonably decide on a silicon scale that balances manufacturability with resilience with power consumption. So rather than trying to get to 8nm silicon, buy a bunch of cheap-ish commodity 50 to 100nm fab tech and run with it.
That would go with using a modern language/environment as the base, so that multithreading is less difficult (async everywhere) and having lots of cores becomes easier than trying to hit 4GHz all the time.
I say that as someone who spent 10 years being "the Delphi threading guy", wandering around making threaded Delphi code work properly for various companies. Ye olde worlde procedural languages (with optional object orientation) just make multithreading unnecessarily difficult. Better languages, taught from the start, make asynchronous programming much easier. Rather than forcing kids to learn "when you call a subroutine your code stops and waits" and "when the user clicks a button everything stops while the button click code runs", you teach them that everything uses events and callbacks (which are the same thing): "you write a bunch of independent code fragments, and link those to events that can happen". As they say, if a ten year old can do it, what's your problem?
Ah, the Philips thing I think explains something. My dad moved out of the ex-family house last year, and the CFL bulb in the garage failed around that time. He had installed it when we moved in, about 28 years earlier. That bulb must have been on for days at a time, and yet it lasted so long.
As for LEDs, what I think he means is like the issue with the one I have, which has an attempt at a lens to diffuse the light, but actually it isn't very good and leads to dark and light patches directly underneath it. See, language is difficult because it sparks different chains of association in different people's brains. But you know this. I don't think enough people do, though. The one near me just now - I can't tell what it is like inside the round bulb housing, but I think that is designed well to diffuse the light across a hemisphere.
So, why do LEDs need ballast in the first place? I thought CFLs etc. did because they do funky plasma stuff; LEDs obviously don't. Also the number of LEDs will surely be irrelevant, because you can get them in all sorts of shapes, sizes, power, efficiency etc.
It's also not just an issue of surface area for heat dissipation, but also of designing for heat transfer in the first place. You see this with high-powered LED torches, which I have read a great deal about and own a few of. The expensive ones often stick the LED onto a brass or copper lump which is then in contact with the metal case, so it can get rather hot in operation but does mean the LED doesn't burn out.
I wish I had a knob like that. Never seen one though...
Hydro is the obvious starting point for any plan for cross-time uplift, because, well, you could probably manage to build dams with social organization equal to the Ancient Egyptians.
... as we can see by looking at the actual dams built by actual ancient Egyptians. I'm kind of eyerolling at this whole discussion because of the use of hydro and wind power that we know of going back quite a long way. It's the electric part that's new and exciting. Windmills and pumps likewise... just think about the name for a second: "windmill"... why is "mill" in there? It's not as though they're used to grind the wind.
It would be quite possible to start with high-torque, low-speed water power applications, just like our ancestors did, and push the technology forward as their industrial base advances. Just knowing what works and what doesn't, in detail, would help a lot. Forget digging canals everywhere, build railways. We know that... now. Canals are for moving ships between oceans, not pallets of stuff between towns.
On that note, palletised and containerised transport would make a huge difference. Like an army, civilisation marches at the head of a long logistics tail and cleaning that up has flow-on effects everywhere. It's not just about you getting your Amazon order overnight, it's about just in time manufacturing which means you don't have to fill a huge pipeline before anything useful comes out the end. And so on.
I've suggested it too... indeed I seem to remember it was how everyone expected electric cars to develop when there weren't any. You just decide on a standard size battery and then every car has a removable rack that holds a suitable number of these for the size of the car, which you can swap in and out with something along the lines of a pallet truck. The main objection people seem to have is "electric cars now are not designed to allow that, therefore they never can be", which makes no sense.
Canals can move a very large amount of material slowly with simple geo-engineering, usually by modifying existing waterways or connecting adjacent rivers with channels and locks. Canals also tend to start/end at ports which are usually on estuaries because of riverine traffic, the precursor to canals. Chickens, eggs.
Railways require tunnels, bridges, a lot of geo-engineering and movement of soil and rock. The steam shovel and dynamite are the main tools that made railways workable other than in a few places where the geography was already conducive. They also need a lot more ongoing maintenance than canals usually require. They do work better in dry areas though.
So rather than trying to get to 8nm silicon, buy a bunch of cheap-ish commodity 50 to 100nm fab tech and run with it.
There's no "commodity" fab tech out there, it's all custom-rolled and usually under ITAR-like controls as the military applications of silicon fabs are obvious (non-crippled GPS chips, for example, the sort that don't quit working when velocity exceeds 200mph or whatever). The Commonwealth needs to develop their own fab hardware based off existing knowledge but a lot of that is proprietary, not open source and somewhat arcane -- there are horror stories about cursed fab lines built with the best modern kit and finest engineers that took years of pray and try to make them productive.
I've not seen anyone claim to have Hacklab-quality homebrew silicon chip production systems like 3D-printers and desktop CNC which are commonplace in our timeline today. Some of the highly toxic chemicals involved might be one of the reasons, I suppose.
Yea, but I still would not touch a CFL, especially now that LEDs are available. OK, you still get the occasional dodgy LED "bulb", but it is so obvious that they are the way to go. Our local authority, who can usually be guaranteed to fuck it up, have replaced (as far as I can see) all of their normal street-lamps with LED ones. The night vision for the punters is much better & they must be saving silly sums of money on the 'leccy bills! Soon, the only place I won't have LED lights will be the kitchen, which has two "strip" lights, i.e. old-fashioned vapour-discharge tubes - bright, uniform, almost shadowless, good spectrum.
NO. The steam-shovel didn't really come along until well after railways were already well-established, like 50 years or so. And gunpowder was used for tunnel blasting until dynamite came along (patented 1867) - also well after main-line railway construction. Also canals are only really much use if the gradients are very low; even railways can climb much better than that, even with steam traction .....
Canals... Well, the obvious example isn't the Erie Canal, it's the Chinese Grand Canal, the biggest canal ever built AFAIK. It's still (partially) in operation, although a glance at its 2,500-odd year history (!) shows that keeping a canal in operation isn't as simple as digging a big ditch, even if you have an effectively unlimited number of peasants to sacrifice to its construction. Right now, a good chunk of it isn't even usable.
The other thing about rails is that you need a lot of energy to make them work. This isn't about running the trains, which are about as efficient as you can get for overland travel. Rather, the problem is all the energy you need to build the rails and create (and maintain) the straight tracks. They're also kind of resource intensive. IIRC, forests (in places like India) have disappeared into making sleepers for British Imperial tracks.
LEDs need ballast because they are (approximately) constant-voltage devices. Above their turn-on voltage, dV/dI becomes very small, so a tiddly increase in applied voltage causes a many-fold increase in current, and this in turn makes the smoke leak out. So you need to connect them in series with something else that limits the current - a resistor will do, and this is what is used for indicator LEDs. For illumination you want to use something less lossy, and if you're running off AC mains the simplest such thing is a capacitor.
One advantage of using a large number of small emitters is that you can string them all in series to produce an assembly with a large operating voltage but a small current requirement, which means you only need a small ballast capacitor. With a small number of large emitters, the operating voltage is small but the current is large, which means a ballast capacitor would be impractically large, so you use some kind of switched-mode converter instead, which is more complicated, more lossy, and more likely to go wrong.
The other difficulty with large emitters is getting the heat out of the actual LED chip, because of the high power density. You need to mount the chip directly on something with a low thermal resistance to let the heat get away from it easily enough. So you end up with a hefty chunk of aluminium or copper as closely coupled to the chip as the requirements of electrical insulation will allow; this may not be so much of a problem with your torches, but it is undesirable for a more powerful general-purpose light bulb because of the weight, bulk, materials consumption, and cost. With smaller emitters the scaling factor works in your favour and it is much easier to get the heat out of the chip.
Canal locks are often easier to put in than a railway tunnel, cutting or embankment to alleviate a big shift in levels. There are consecutive sets of locks that can move a boat up or down several hundred feet, such as the Caen Hill flight in Wiltshire (19 locks lifting 237 feet, a gradient of 1 in 44), now restored after falling into disuse.
The thing is, canals are long skinny pipelines - you just don't get thousand tonne loads doing 100kph in a canal. They're also short, there's no real prospect of a transcontinental canal (the Amazon river is probably the closest approximation, but it's not practical to get it over the Andes no matter how well canals deal with elevation changes).
Certainly making the rails themselves takes a lot of energy, and one of the obstacles to railways really getting going in the first place was producing and handling sufficiently good quality iron in sufficiently large chunks. And it probably helps to have canals to move the coal, ore and rails about. But building the trackbed to put the rails on is an operation that can be fuelled on cows and beer, and apart from the manufacture of replacement rails, maintenance is also mostly a low-energy operation (though you do get "hot spots" that need constant attention, most of the mileage isn't like that).
Round here we have examples of both, more or less side by side - the Tardebigge flight of 30 locks, and the Lickey Incline, 2 miles at 1 in 37. The Lickey, though more massive, is much less complicated a piece of engineering. Both have been an operational pain in the arse since they were built, but the locks rather more so, whereas the Lickey has become much less of a problem in recent years as engine power has increased. To cope with much the same difference in elevation, at Ironbridge they built a special railway to haul the canal boats up and down the hill.
I'm more used to Australia (flat, so ideal for canals except for one tiny problem with the water supply - flood or drought) and Aotearoa where the landscape is a bit too active to make canals viable. It only takes one leak and your whole canal stops working. Ok, that and most of it is porous as all get-out, so you would need a great deal of sealing work as well as a very large number of locks. The Rimataka Spiral springs to mind as an example of something that would be doable but tricky for canal engineers. It was tricky for the railway ones :)
http://nzetc.victoria.ac.nz/tm/scholarly/tei-Gov11_06Rail-t1-body-d12.html
No, but you do get to move 1000t of cargo using maybe 50hp...
Canals work because they are pipelines; for every so-much stuff you put in at one end you can take out so-much stuff at the other end, and it doesn't matter that it's not the same stuff because one lump of coal or limestone or whatever is very much like another. You set the overall flow rate to match how much you need on average, and if at some point you have more coming out than you happen to be able to use at the moment, you can always make a big pile of it and keep it for when the flow's not quite enough. It works just fine as long as you remember to shoot people who tell you you can't plan ahead or store things.
That [CFL] bulb must have been on for days at a time, and yet it lasted so long.
Not surprising. A while back I had a Philips CFL fail on me - base turned brown and melted - so I contacted Philips about it and sent their QC engineer some pictures. He wanted to know where it was mounted and how often it was turned off/on, because apparently the two most common causes of failure are overheating from being in an enclosure, and being turned on and off rapidly (from, say, being in a closet or hallway).
I remembered from Reliability Engineering classes decades ago that light bulbs burning for years weren't uncommon — it was the shock of turning them on that caused burnouts — and apparently the same is true of fluorescents.
1000t of cargo using maybe 50hp...
I'm seeing 100 tonnes per horse reported as an upper limit, which suggests those are pretty heavy horses you're looking at. I mean, I think canals are cool and I'd quite like to play on one, but I'm also aware from playing with boats that my interest will be short term. I can move a 200 tonne boat by hand, but I don't want to have to drag it 100 kilometres.
It's less about what can be done, and more about what actually works in practice, though. Which is why we don't use (or build) canals for bulk haulage any more. The Commonwealth should really look at this and say "is building canals the best use we have for labour" and IMO decide not. For the same reasons we do today.
Riverboats might be a different thing, there's a reason big cities are often built in estuaries or on navigable rivers.
"The main objection people seem to have is "electric cars now are not designed to allow that, therefore they never can be", which makes no sense."
Battery swap first started in 1896 (no, not 1986, Eighteen Ninety Six).
The model S from Tesla was specifically designed to have battery swaps. Several other lesser known cars as well. Tesla built a few battery swap stations near the places with the highest density of Teslas, along routes to destinations with a lot of traffic. They swapped batteries in approximately 1/3 the time it took to pump a tank of petrol (I'd link to a youtube but I've got an idea that you're in a text only environment).
Tesla kept records of the use of the battery swap stations. At first they had quite a few people do the battery swap shuffle. However they had not one single person do it twice. Never, none, nada (well, according to Elon, anyway). It appeared that either people were willing to try it but found the experience not to their liking, or they wanted to try something they knew they didn't want, just to try it out. Either way, it was dead in the water. The stations sat there for something like a couple of years, gathering cobwebs, and eventually they just closed them.
A company called 'Better Place' had a similar experience with different models of cars, but since it was their only business they folded up.
So rather than it being something that no-one ever thought of, it was actually about the first thing they thought of, many cars were designed to do it, but no-one wanted it.
I just looked at some of the details of those events. I had genuinely thought that it was 'quite a few' who tried the battery swap stations. A little research uncovered that of the nearby drivers who were personally invited by Elon to try the battery swap system, about 2.5% took up the offer, and it was announced in the shareholder meeting that not one of them had done so twice. One of the stations that cost several million to construct had a total of 6 battery swaps in 2 years.
In the end they decided to fit an armour plate to the bottom of the cars to protect the battery from debris (after a total of one car was damaged) and that ended the battery swap program.
Why bother with high tech wind farms?
That was my thought as well, but I approached it from a slightly different angle. Why not start with low tech wind farms?
I used to subscribe to a magazine called Mother Earth News. It was all about applying 19th century technology to creating low cost 20th century solutions.
Back about the time I was reading the Merchant Princes series the archive was available on CD-ROM. Looks like it's been migrated to newer media now.
Still, Miriam could have scored a copy of those CD-ROMs & maybe a Whole Earth Catalog (subtitle "access to tools") from a used book store to add to the tech library she took across to help force growth in the New American Commonwealth.
Seems to me a lot of low cost, widespread 19th century technology could jump-start that growth, because you wouldn't have to expend as much capital creating the precursors needed to move on to the 20th & 21st centuries.
Interesting. I wonder whether there was some kind of unexpected inconvenience factor involved, or whether people who buy electric cars do some odd subset of driving in which they can perfectly plan their routes, times, etc.
I know that as a travelling network engineer I can be re-dispatched to someplace a hundred miles away (and with no infrastructure) on a surprise basis, so for me to own an electric car I'd have to be able to stop somewhere and change the battery in a manner similar to filling my tank.
On the other hand, my friend owns a Nissan Leaf. He drives it to the train station on every work day and plugs it in, upon which he charges his battery for nothing.
And what about people who are making long drives? I drove from Southern California to Oregon during the summer, and of course that was only possible due to being able to fill the tank quickly.
In short, lack of both necessity and battery standardization. IMHO the "battery stations" happened too early.
Short? Please read up on the thousand-mile long Grand Canal. Now I'll admit that the reason it works is that eastern China's basically a plain (the canal has an elevation change of 42 m across that).* Still, it's not quite what you'd expect if you're thinking English or American canals.
*One of China's eventual problems with climate change is that if all the ice sheets melt, the Grand Canal and most of the lands east of it will be underwater. That will be thousands of years from now, but they might want to start planning for it.
One thing strongly in favour of canals for the NAC is that they're far more resistant to bombing damage than railways. (aqueducts aside)
Not just China's Grand Canal but Egypt's Canal of the Pharaohs, built millennia before Suez. The advantages of moving ships from the Mediterranean and/or the Nile to the Red Sea have been obvious for literally thousands of years.
And there's another point for low tech transportation. Railroads as we have them presume the ability to cheaply make vast amounts of iron rail, heavy iron objects identical to decent tolerances and spammed out in great numbers. We here are the kind of people who observe that technical point; there's also an economic point that iron has to be dirt cheap before this happens. The peasantry, no matter how downtrodden and impoverished the aristocracy wants to keep them, must have plenty of iron plows, knives, horseshoes, scissors, nails, and everything else a village blacksmith might make. They need to have these before anyone starts leaving tonnes of iron out in the countryside at night, or bits of the rail system will be missing in the morning. Iron has to be cheap and plentiful, which wasn't always the case a few hundred years ago.
"IIRC, forests (in places like India) have disappeared into making sleepers for British Imperial tracks." No. Both Teak & Jarrah woods were early recognised as very valuable, but also of limited geographical extent - but also renewable. Plantations & a rotation regime were set up. IIRC, rehashed/reorganised continuations of those re-planting cycles are still in operation.
Maybe, maybe not - but you can get a 2000 tonne barge from the Netherlands to the Russian border - once a year I sit by the junction of the Dortmund-Ems & Mittelland Kanals & watch the barges go by!
NO. Certainly not with precision bombing (a.k.a. a cruise missile). You drop a really large one into the lock when it's full - the bottom gates will blow, then you drop/aim one into the top gates ..... If you drain both ends of the "top pound" then it will take some time to repair - lock gates are large things.
Are the French that accurate?
Anyway, I forgot about locks... You're certainly right. The other thing I didn't think of is that dams are usually needed to supply/control water for a canal system. I think the dam buster squadron proved that dams are vulnerable to air attack even with 1940s tech. An earth wall dam might survive, but probably not, and all the water control gear would be destroyed.
Still, they're thinking about corpuscular petard attack. A direct hit on a switching yard means no switching yard. A direct hit on a big canal bay means... well, a bigger canal bay. Canals would be pretty much immune to an airburst/shockwave/heat. The petard would need to parachute down and explode on contact with the ground/water. In the face of corpuscular-tipped SAMs, that might be difficult.
Don't bet on LEDs being more reliable. I have them in the bathroom and on bicycles, and most of them are less reliable than CFLs. As with all fancy technology, it's down to how well and conservatively they are engineered. And, as far as street lighting goes, what we need is less of it - just as we need less requirement for and dependence on driving, rather than just a new motor technology. And, as usual in the UK, the claimed benefits of a new technology or even gimmick are used as a diversion to avoid addressing the real problems.
Ordinary incandescents are good for a great many cycles, but have a limited lifetime; fluorescents are the converse. That is, of course, why the first widespread domestic use of fluorescents was in bathrooms. Yes, really.
"On that note, palletised and containerised transport would make a huge difference."
Oh, yes! If you're talking 1950s-60s technology then without containerization a majority of freight cost, and time taken, for shipping is the loading/unloading cost at port.
I've read an argument that the efficiency gains of containerization pretty much caused the globalization wave of the 1970s to 2000s. Not sure I believe it was "the" cause, but it was a big part of it.
The trouble with getting there, of course, is overcoming network effects. No point having container ships without ports, or ports without ships.
The deadlock was largely broken by the USA's Dept of Defense pushing containerization in the mid 20th century. But the US DoD had been seriously scarred by the failure to get goods into France after Normandy - and had a Korean War on. So they had both motive and means to influence how shipping worked. Not sure if the Commonwealth could do something similar.
Actually if I was kicking off NAC infrastructure, the main thing I'd do would be to have a decent broad gauge rail. I was frankly amazed when I saw the tracks in the NAC were about the same gauge as our timeline's. I'm sure there are lots of rail aficionados here who will be happy to correct me, but I always regarded the fact that the broad/narrow debate went to the narrow as a tragedy. That would also have strongly influenced the size of multimodal shipping containers, and thereby, roads. Frankly, if I was designing from scratch, I'd want the rail gauge to be double or triple the current ones.
You don't HAVE to fuck up the design of references, you know. How they are implemented is irrelevant - the key factor (if they are done right) is that they are scoped more narrowly than the object they point to, so are always valid. That is confounded (nowadays) by them being immutable (i.e. always pointing to the same object), which means that they cannot be used to implement the genuinely 'pointer-based' algorithms, but a less restrictive class of mutable references is possible (vide Algol 68 and others).
C++ allows you to implement classes that provide the equivalent checking, but (a) it can't be checked statically, (b) the class programmer needs to be a better software engineer than most 'C++ experts' are, and (c) the user of the classes mustn't use any of the language's back doors (and worse), deliberately or accidentally. That last is a real killer, because it's also beyond most 'C++ experts'.
Broad gauge requires more geoengineering for cuttings, tunnels etc. and only provides a small amount of extra benefit in moving, for example, large armoured vehicles to warzones. Japanese local services run on a 3 foot 6 inch gauge, including express services at 150 km/h as well as containerised freight, although it is not typically TEU standard because of the smaller gauge and the resulting smaller tunnels.
The high-speed shinkansens operate on a classic British gauge of 4 foot 8.5 inch rail but in a totally separate network.
Soon, the only place I won't have led lights will be the kitchem, which has two "strip" lights, i.e old-fashioned vapour-discharge tubes - bright, uniform, almost shadowless, good spectrum.
For the past few years there have been drop-in LED strip lights — same form factor as the vapour-discharge tubes. You need to replace the ballast circuitry but then you're good to go with LEDs.
Given how long those tube lights last I think you may be using them for a good few more years, but when they next burn out a good quality LED tube should see you out for the rest of your life.
It actually went to the 'middle' gauge - many other gauges were derived from mining etc., and were much narrower. As big as you favour would make them unsuitable for many uses, including commuter transport. But, yes, you are right for long distance, high speed, and heavy freight, because many of the serious problems go up with the inverse square of the gauge.
whether people who buy electric cars do some odd subset of driving in which they can perfectly plan their routes, times, etc.
Ever spent a quarter of an hour at a Tesla showroom?
Their cars come with a big-ass screen as its main user interface. There is a GPS/satnav/moving map system, naturally. It also takes into account battery charge and availability of charging points when planning a route, so for long journeys it navigates point-to-point between charging stations (which in the US and UK are rolling out first in areas densely populated by Tesla owners, and second, along interstates/motorways). Upshot: you have to override a bunch of warnings to drive out of range of a charger point, and if you want to drive coast-to-coast in the USA your car will give you a route which, while not necessarily the shortest road distance, will get you to your destination without a flat.
As EVs become ubiquitous the shortest route and the optimum route will become the same. And given that the range of a high-end Tesla is now pushing 300 miles, taking an hour or two off for an enforced meal/sleep break while the car recharges for the next four hour drive would seem like a good idea ...
Note per novel: the Commonwealth has had railways for about a century at this point, so there's lots of in-place infrastructure already on the ground—entirely parallel evolution, hence things like the different signal light conventions (noted in "Empire Games").
The real breakthrough they've made relative to the USA is to electrify their freight tracks (and provide segregated tracks with in-cab signaling for high speed inter-city rail): domestic air travel is still relatively primitive, but they've got trans-continental sleeper service with under-24-hour travel time from NY to the Bay area (and a freight backbone from the Bering Straits to Tierra del Fuego).
"only provides a small amount of extra benefit in moving, for example, large armoured vehicles to warzones"
Which would probably be because large armoured vehicles are designed to fit on existing rail stock. They'd probably be bigger if they could be, given that they're exactly as large as they can possibly be and still fit through European rail tunnels. It seems like that's a design constraint. (Now I've annoyed the tank aficionados too) It's also a constraint on the size of rockets (Elon says the F9 would have been fatter but it wouldn't go on a train). It would be much easier to move factory kit around if they could go on trains and or be made in bigger chunks. So you could probably roll out more factories in a given time. There was some discussion of wind turbines. 3 times wider could also mean 3 times longer for each carriage. So blades would fit, as would nacelles. The length of a carriage is becoming a limiting factor for wind turbine design in our timeline. 55 metre blades seem to be near the limit for rail, but the largest turbines are now running 80 metre blades and that's expected to grow.
http://www.railengineer.com/windlogisticsprojec/55-meter-blade-transport/
You'd also fit 27 times more people in each carriage if it was 3 times higher, longer and wider. Roll On Roll Off freight would work better. Vehicles the size of semi trailers could be driven on and off with ease. It seems like making the rights of way 3 times as wide, and the tunnels 9 times the trouble to dig, would be more than paid off with trains that carried 27 times more stuff.
"Note per novel: the Commonwealth has had railways for about a century at this point" Ahhh, that explains it. Thanks.
Interesting. As I am in the process of replacing the 30-year old fluorescent strips in our kitchen, your post has just encouraged me to look them up. Unfortunately, I think that it will be a few years before they are a plausible replacement for fluorescent strips for most uses, for the reasons given in the following references (especially the first):
http://luxreview.com/review/2016/05/led-tubes-to-replace-t8-fluorescent-lamps http://www.premierltg.com/should-you-replace-your-t8-fluorescent-lamps-with-t8-led-tubes-2/
I searched for some where I could replace just a single-tube fitting (5'), which was surprisingly tricky, and found that it would cost 120 quid, even if I could get it in the UK, and the lack of side-lighting would still be a problem. Well, that's about triple what I have spent (at modern prices) on the fluorescent equivalent over 30 years of heavy domestic use (we live in the kitchen).
On the directional aspect, everybody uses LEDs for bicycle safety lighting. They are FAR too directional, because a lot of accidents are at junctions, where the view of the cyclist is at an angle to the cycle's direction. Yet none of the multi-LED ones have them at multiple angles - I use multiple, cheap ones to achieve that effect.
domestic air travel is still relatively primitive, but they've got trans-continental sleeper service with under-24-hour travel time from NY to the Bay area
This is something I'd like to travel in. Too bad we don't have the dimensional-hopping tech.
Indeed too bad, I'd love to see some proper re-wilding projects that brought extinct animals back.
You mean like smallpox?
Given the following:
How long would it take to ramp up production and contain an outbreak if hostile world-walkers decided to simultaneously reintroduce it in half a dozen cities at once?
Spending a lot more money (twice the size in rail terms would be ten times the cost for tunneling, bridges, rolling stock etc.) to cope with 0.01% of the traffic as special cases such as turbine blades and rocket casings is poor economics, unless Musk and the wind turbine folks are willing to pay for the extra expenses -- answer, no chance. My solution for SpaceX wanting to move rockets around would be to fuel them up and fly them to the launchpad...
Packing twenty times as many people into a train at a platform would pose problems in personnel traffic flows which tend towards the chaotic at the best of times, resulting in long periods between train movements in and out of stations thus obviating the need for such mega-trains.
As for Charlie's idea of a sleeper rail service covering the US coast-to-coast in 24 hours, it's not that likely to actually work out in practice especially if there are intermediate stops on the way. I regularly use a sleeper service in Japan, the Sunrise Seto/Izumo that takes about 8.5 hours to cover 600km on the narrow-gauge "limited express" network between Tokyo and Okayama.