
Excuses, excuses

Please excuse the shortage of blog posts. I'm up to my elbows in a novel that's eating all the keystrokes I throw at it, then belching and asking for more: it should be done—at least in first draft—in another week or so, but in the meantime I don't have much energy for other writing.

Novels are pretty much the most complex form of written fiction you can work on, unless you're the kind of glutton for punishment who goes in for long series works in which each novel is an episode. Of which I'm guilty: the current millstone tied to my neck is book eight in an ongoing series with a lot of back-story, so I'm constantly weaving long-term plot threads together. And a side-effect of scale is that with seven earlier books and nearly a book's worth of short stories to draw on, there are a lot of loose strands that ought to tie in somewhere later on. One of the cruel paradoxes of such series works is that they take so long to write that the author is doomed to age-related declining memory function just as the work becomes so complex that they need the voracious retention capacity of a young adult to grapple with it. I wrote "The Atrocity Archive" when I was 35; I'm now grappling with "The Delirium Brief" at 51, and I reckon I need to finish the core of the Laundry Files series before I turn 60 lest I lose the plot completely somewhere in the final volume—provisionally titled "The Alzheimer's Tract".

(That was a joke, by the way. Ha. Ha. Ha.)

Another problem with series works is that, unless you were at pains to plan the thing as a unitary whole when you started it, the background expands like elephant's toothpaste (or maybe Mercury(II) thiocyanate decomposition in the case of the Laundry Files). "The Atrocity Archive" wasn't so much planned as it just happened—a one-shot short novel written with no thought of sequels. In fact, the only way to make sequels work was to make Bob a horribly unreliable narrator (a role at which he excelled). By the time I introduced other viewpoints into the series (Mo in "The Annihilation Score", Alex in "The Nightmare Stacks") I had a pretty good perspective on what the world was about, and a rough idea of the trajectory of the 12-book story arc. But I also had so much back-story to keep track of that bits kept dropping off my radar, because middle-aged memory is not my friend.

Do you want to know the real reason George R. R. Martin's next book is late? It's because keeping track of that much complexity and so many characters and situations is hard work, and he's not getting any younger. But George has pioneered some fascinating coping strategies for the authors of long series works, which can probably be summarized as Martin's Law: "When the plot metastasizes, the best form of chemotherapy is to massacre your protagonists". (Personally I find a bijou nuclear apocalypse works best, or failing that, having an Elder God return to dine on everybody's soul.) Next, you've probably heard of playwright Anton Chekhov's rule: "if a gun is placed on the mantelpiece in the first act, it must be fired in the third". To which I'd like to add an observation of my own: if you're writing a series work you're allowed to leave the guns on the mantelpiece between books, but sooner or later you need to pull all the triggers or your readers will forget about them.

So that's what I'm doing right now: I'm running round the metaphorical house, pulling guns off all the mantelpieces and firing them wildly. Ahem: most of them. And I'm taking care to position a couple of new guns where I can find them for future books (or, in one case, a Chekhovian tactical nuke). And that's why your weekly blog essay is delayed.

95 Comments

1:

Could you ever see yourself collaborating with another author on a book in one of your series? I'm genuinely interested, in the same sort of educational vein as your really interesting posts on the nature of the writing and publishing business. The answer "No, because..." could in itself be pretty interesting.

I know there are various models for this sort of thing: simply taking money to allow someone to write in your universe (licensing your universe, so to speak), or you being the plot man and your collaborator crafting prose for you to approve or not (which, as an aside, I think might be interesting given that you deliberately modelled various Laundry novels on the styles of different writers - I enjoyed that and sometimes wonder how various different authors would have written the same story), and I'm sure a plethora of variations I've not thought of.

I'm not asking or suggesting that you do this; just wondering what the logistics of it would be and all that sort of thing, given how entertaining and informative other posts you've written on the publishing industry have been. I routinely point aspiring authors at your publishing posts.

2:

Charlie already did that, with .... Cory Doctorow - "Rapture of the Nerds"

In the meantime, last night, someone proposed to me that we write a 60-years-later update to a classic railway history, as witnessed by the said authors. We've only got until 2019 to get it done, but there would be a hell of a lot of researching & data-mining to do - oh & I'd have to go round & update the illustrations by taking some informative pictures, ahem.

3:

OGH approached an answer to your question here.

4:

"Martin's Law". I like that. Also applies nicely to DC/Marvel 'let's kill everyone in the multiverse' publishing strategies.

5:

Was "Rapture of the Nerds" part of one of Charlie's ongoing series?

6:

Could you ever see yourself collaborating with another author on a book in one of your series?

Speaking from experience (I've written a collaborative novel before) a collaboration between two authors is a process whereby each writer does 75% of the work for half the money.

And with the amount of back-story in a pre-existing world? Nope, just nope, not going there. (The back story isn't just detail, it determines the trajectory of the future story and locks stuff in.)

7:

Given that Basilisk Guns are a noted feature of the Laundry series, and given how good at surviving otherwise-lethal events some of OGH's protagonists can be, I have the sneaking feeling that even a nasty cock-up with the London Ring of Steel camera system would not be enough to thin the crowd very much.

I would also note that this sort of thing has occurred before; in Glen Cook's detective fantasy series, one character is only referred to as "The Dead Man". He is definitely extremely dead and has been for a couple of centuries, but that doesn't do much more than slow down his species and force them to devote more time to the intellectual as opposed to the physical.

I would therefore suspect that the Laundry, together with the extremely binding geas upon all employees, can call upon a very, very large supernatural workforce in times of crisis, which is likely not going to be all that conducive to the mental stability of the author.

8:

Yesterday I tried to work on proofing the 1st draft of the novel I finished a few months ago and found all my notes were gone. I had saved it as an epub to iBooks and was highlighting and making notes that I could email and print later to add back into the file. Apparently as I was trying to save something from Safari to iBooks as a PDF the other day it crashed and seems to have reset some things, somehow, and a few old books were shown as new.

On the plus side, as a test I had emailed the notes to myself a couple months ago, so I at least have the notes from the first quarter, but I've lost all since. Well, proofing was going slow because I've been writing a prequel-ish novella, so I'm upset but will live, and maybe make better changes (trying to keep positive). Guess I'll have Draft 2.25 when I get done with round one. Meanwhile, rethinking work practices --like backing up the damn notes regularly. Pfffffffftt...

9:

Sounds interesting...

10:

Mo in "The Delirium Brief"

The Annihilation Score, surely?

11:

That sucks, sorry to hear it. I had some problems going back and forth between Scrivener and Word when I was finishing up Hot Earth Dreams. It turns out you can transfer footnotes from Scrivener to Word, but not back. Of course, going from Word to Smashwords Premium turned out to be impossible without $$$ (because tables. And footnotes). It's interesting how, even when we think that technology should make us free, it still dictates our production flow and even the form of the novel. For example, a text optimized for an eBook shouldn't have sarcastic footnotes and can't have an accurate index.

Getting back to the novel series and the original topic, I think Mercedes Lackey had a good idea for things like Valdemar: she did trilogies starring protagonists, then retired them. While former protagonists often made cameos in other trilogies, they were rarely the stars of those shows (I'm only hedging my bets because it was a long time ago that I was into Valdemar). The "...and they lived happily ever after" expulsion is a good one. They start young, have adventures, get their rewards, and then fall off into management, while the next generation of young heroes comes along to tackle the problems the first protagonists made while resolving their crises.

12:

Well, this post implies you are done with copy-edits on the other 4 books, so that's a major positive at least. You gave the impression over the past few months that doing that sort of work really sapped your creativity and drained the enjoyment out of being a writer. So soon you can get back to the parts you love.

13:

Takes conscious effort to back up onto disk, so I got into the habit of emailing my in-progress work to myself just as I was shutting down for the day. This also allows me to jot a to-do/reminder list for the next day into the body of the email. Also do the regular disk/desktop-save thing ... and always put in the date/time of each revision/update.

Why I do this: Over the years several PCs/laptops have died on me or their innards got scrambled BUT my email was always, always accessible as soon as I could get at another machine. Good to know because a dead machine/laptop in no way changes the fact that a report is still due on a particular date.

Rescued machines now serve as emergency back-up plus extend my ability to use a greater variety of resources/docs on an as-needed (short-term) basis. Something else that I've learned is that when having to use multiple reference docs at the same time, it's much easier and more reliable (i.e., fewer reading errors) to have each doc specifically located in one PC/laptop than constantly switching windows/views. When I get really antsy about having to work with/across multiple documents concurrently, I also stick different colored post-it notes on top of each machine to remind me that this machine (PC1) has report/doc A, the next machine has rep/doc B, and so on. This works best/only if you can do this type of work from home.

14:

My bad: The Delirium Brief is of course a Bob novel. Although with a few bits from other POVs. Iris Carpenter's, for example ...

15:

Nope, those other books aren't done with yet!

16:

Looking forward to seeing her POV. Gonna wonder what's going to happen to the cultists bringing about CNG after they 'win'. I imagine their objects of devotion will need ketchup.

17:

I think that William Faulkner had it right: "The past is never dead. It's not even past.", as you imply with your remarks about guns :-)

18:

51? You wimp.

After far too many years, I finally got the push I needed this past (US) Memorial Day, when I was introducing one of my daughters (who is also a programmer, Charlie) to a beta reader friend, after she'd gotten 135,000 words serious about the NaNoRyMo, or whatever it's called. So, esp in the last month or two, I've gotten together the last novel my late wife had done (she wrote, I pushed plot, so it was a collaboration) that we had sent out once before she died. Now that I've moved it from WP to Word format in LibreOffice and am reading through it, I'm unexpectedly finding "[need to expand here on stuff]" in a handful of places, so I'm having to add that in. The good part of that is that, 20 years ago, 44k words was a perfectly good novel length.... Now, well, it's headed towards 50k, and could well go a bit over, though not much.

And that's my excuse for last night, when I should have fired up my new work laptop and tried, once again, to log into it at home, where I haven't been able to since I got it.... (Get it working now, rather than waiting for when I need it....)

And Alzheimer's? My response is to ask how you expect me to remember this current trivial stuff, when my mind is jam-packed with Important Information.... Um, would you like me to sing you the theme song to the 1959 TV show, "Robin Hood"?

mark

19:

I don't have any particular comment on the content --- other than the always unhelpful 'write faster so I can buy your books, dammit' --- but there does seem to be a really disturbing imgur link mispasted into the beginning of the second paragraph:

imgur.com/aXI6oeR.gif

20:

Do you have any tricks for keeping note of important details in previous books? I can't imagine that rereading the entire series or trying to keep everything in your head is really feasible right now.

Alternately, would it be useful/a good idea if your readers enumerated details (say, something like Memory Alpha or WookiePedia, only smaller and related to your ongoing series)? Or, would that make plot twists easier to spot in a way that is to the detriment of the readership?

21:

IN THE ..Mean ? Time ..and across in the Twitter -verse? ..

" Charlie Stross ‏@cstross 29m29 minutes ago

NB: I just passed 2200 words today and my brain shut down. I am now rebooting in full idiot mode. Weekend off work and AFK impending."

Just above that POST ..beg your pardon whilst I Play a few bars on the Sherlock Holmes Violin and mull upon .. 'My Dear Watson ! ' ..

OUR Host has certainly Noticed the Post above His ..as re-posted by Himself

" I worked with someone who was involved in testing a porn filter. It kept flagging up pictures of Margaret Thatcher."

Ah, HA! ..Hum, one or two more !!! marks, perhaps .. to add that very necessary narrative Impact?

And thus onward to ...

" Margaret Thatcher: The eyes of Caligula and a killer wardrobe " ..

http://fashion.telegraph.co.uk/article/TMG8765646/Margaret-Thatcher-The-eyes-of-Caligula-and-a-killer-wardrobe.html

Too Late for Works in Progress of course but there's lots in the latest/most recent Laundry Novels - Super Hero Uniforms - to suggest a Fashion Industry Based Short Story in the forthcoming Anthology ..

22:

Re: 'Alternately, would it be useful/a good idea if your readers enumerated details ...'

Or, readers coming up with a series of multiple-choice quizzes - one quiz per character? The correct answer would include book, page and relevant quote.

Some authors release a 'Guide to ...' book that summarizes/highlights all of the characters and main features of that world. Such an author's notebook helps maintain interest in the series between titles and keeps cash flow flowing.

23:

Fixed. (I blame the cat; the imgur link was really meant for twitter.)

24:

" .. (I blame the cat; the imgur link was really meant for twitter.) "

I feel sure that if Ernst Stavro Blofeld had been brought before the ICC ..

" .. The ICC has the jurisdiction to prosecute individuals for the international crimes of genocide, crimes against humanity, and war crimes. The ICC is intended to complement existing national judicial systems and it may therefore only exercise its jurisdiction when certain conditions are met, such as when national courts are unwilling or unable to prosecute criminals or when the United Nations Security Council or individual states refer investigations to the Court. The ICC began functioning on 1 July 2002, the date that the Rome Statute entered into force. The Rome Statute is a multilateral treaty which serves as the ICC's foundational and governing document. States which become party to the Rome Statute, for example by ratifying it, become member states of the ICC. Currently, there are 123 states which are party to the Rome Statute and therefore members of the ICC." ..

HE, too, would have blamed .." THE CAT " ..International Feline of Mystery and Evil Genius Par Excellence ...

https://www.google.co.uk/search?q=blofeld&ie=utf-8&oe=utf-8&gws_rd=cr&ei=uk6ZVpznLYr2UMP9oYgL

25:

" ..Um, would you like me to sing you the theme song to the 1959 TV show, "Robin Hood"? "

Wot? Really? This ..EVERYONE in the U.K. of my Generation - I am over 60 - could do that! Thus ..

https://www.youtube.com/watch?v=RIqM_Ku1uhc

And, also, we all would remember the National Anthem of the US of A's TV ..

https://www.youtube.com/watch?v=ogAIjqJbfi8

Scroll down through comments to ..

" I'm a 6'-5" blond bodybuilder...and even I feel threatened by this song. If Hollywood wanted to do something to make American Men hate themselves and obsess over the shallowest manifestations of masculinity, then this is how they did it. This surely was more effective at destroying the collective spirit and cohesion of American men than all the Marxist propaganda the MSM has ever been able to insinuate into their lives. This has to have been totally alienating/isolating to most men. "

But, better than both of these would be ..

All together now ...

https://www.youtube.com/watch?v=FaN9XHMXpG4

26:

"Do you have any tricks for keeping note of important details in previous books? I can't imagine that rereading the entire series or trying to keep everything in your head is really feasible right now."

The problems are exactly the same as those of trying to extend a fairly large and complex computer program. There are tools available for taking and searching notes, but the problems are (a) realising that you need a particular kind of tracking in advance and (b) converting more-or-less ill-defined concepts to precise characteristics in a matchable fashion. If OGH can solve those ones, there is a seriously large market for the IP.

That being so, I should also be interested in knowing any useful tricks, because of my even more aging brain (68), though I would apply them to programming, not fiction.

27:

I had a hard drive die last spring and was saved by automated nightly backups in my home network, and as a Software Configuration Management professional I needed to play with a recent version of Perforce...

My NaNoWriMo setup included nightly Perforce checkins of both the document and the notes files. With the workspace in my Dropbox area and the workspace and Perforce repository both backed up to Western Digital "myCloud" network disk server.

Perforce is overkill (unless you are used to it), but you might look at git, which is free and available on all platforms. (And be sure to back up your repository to some place outside its home system.)

A 'git add' followed by 'git commit' at the end of each work session could save not only your working files, but their changes over time, with a fairly efficient storage mechanism. If I want something that used to be in the main document or in one of my notes files but got nuked, I can get at it provided it was saved in git at some point between its creation and nuking. (So getting into a habit of doing a commit before you nuke a big chunk of something is a good idea.)
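For illustration, a minimal end-of-session snapshot could be as little as this (all file names and the commit message are hypothetical):

cd ~/writing/novel
git add chapter07.txt notes.txt        # stage today's changes
git commit -m "end of session: ch7 drafted"
git log --oneline                      # list snapshots, newest first
git show HEAD~3:notes.txt              # print the notes file as of three commits ago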

Dropbox for the workspace just made working on multiple machines easier: I could have done the equivalent by running multiple workspaces from Perforce if I hadn't already had a Dropbox account that was mostly empty.

Hmmm. I should write this up somewhere.

28:

Since Charlie has a history as a programmer, I wouldn't be surprised if he could tell you some git tricks. :-)

Seriously: he's not GRRM, who uses computer archaeologists for his setup.

29:

Nope, the most recent VCS I've used was rcs, back when I was using my home-rolled toolchain for writing books.

These days I just rely on Scrivener and Dropbox, with Time Machine for backup (data replicated onto two computers, each with two Time Machine target drives for alternating backup: Dropbox for offsite and sync).

30:

I lose the plot completely somewhere in the final volume—provisionally titled "The Alzheimer's Tract".

Clearly K syndrome.

31:

That'd be a neat way for OGH to cover up any major holes in the final story - make the narrator someone who's slowly losing it to K Syndrome (or even being a V syndrome victim), but covering up their increasing erraticness as best they can.

Has to end with the narrator's death, of course.

32:

That is fine for handling people forgetting old history, discrepancies in reported events, making old mistakes anew, etc., but is very irritating when used to cover up clear factual inconsistencies. Just as obsessive tidying up is - in anything beyond a short story, one expects a few incidental aspects to remain loose - life is like that. Mencken had it right :-)

33:

Firstborn had to read "Flowers for Algernon" as part of his English course at school - nearly had him in tears ;)

34:

Keyes' Flowers for Algernon and Asimov's The Ugly Little Boy (*) got me interested in SF. This genre is not just jet-packs.

https://en.wikipedia.org/wiki/The_Ugly_Little_Boy (see the "Television adaptation" section):

  • 'In 1977, "The Ugly Little Boy" was made into a 26-minute telefilm in Canada ... The film is noteworthy for its fidelity to the short story... which has gained the film praise from both fans and reviewers.'

35:

On the other hand, Robert Jordan managed to keep the plates spinning long enough to run into another problem. His mid-WoT books started by checking up on all the major characters for a scene or two. That took about 700 pages, so there wasn't much room for anything else.

36:

>>>Speaking from experience (I've written a collaborative novel before) a collaboration between two authors is a process whereby each writer does 75% of the work for half the money.

Brothers Strugatsky, the most famous Soviet sci-fi author (consisting of two people), had a method which they found very efficient (for themselves). They literally sat in the same room and wrote their books together, one brother sitting at the typewriter and the other lying on the couch. They would go over each sentence together, before committing it to paper. This way, they claimed, really cut down on the editing, as the text was already "edited" by the time it was done.

Having read the books the brothers Strugatsky wrote together, as well as the books each brother wrote separately, I can attest that all three bodies of work had their own consistent style, as opposed to most other collaborations I have read, which always felt like a mish-mash of texts written by different people.

What I want to say, I guess, is that there probably exist more efficient ways of collaborating, yielding a better ratio than 50% of the money for 75% of the work.

37:

The Strugatskys' process sounds a lot like pair programming. So yes, I'll concede that it's workable for some. The problem with employing it for writing fiction, I think, is that fiction pays so poorly that most writers can't earn a living at it at all, never mind enough of a living to keep two people (and, optionally, their families) fed.

If a couple could bootstrap their way to writing collaboratively, yes, I can see it working from the outset. And I know a few married couples who write together on the same projects (although how they split their workload is ... hmm, I should go ask). But for most of us it's a solitary occupation because unless we're literally living as family we can't afford the overheads of a shared workspace with a stranger (including one of us moving to be within commuting range of the writing partner -- writers tend to be spread thin on the ground).

38:

And like pair programming it probably only works if the pair share exactly the same vision of the finished product, down to the last dot and comma.

Anything less and someone gets killed, or at least maimed.

Not a fan of pair programming.

39:

>>>> And like pair programming it probably only works if the pair share exactly the same vision of the finished product, down to the last dot and comma.

Well, the Strugatsky brothers were brothers, after all, so that probably helped.

40:

I use vim and git locally, but have started using

gsutil -m rsync -d -r writing gs://bucket/writing

for "$HOMETOWN struck by meteor; oddly, I survive" backup.

It's got to the point where there's a whole bunch of options. I could, for instance, pay $PITTANCE for remote git hosting somewhere, but that implies collaborators and comes with an unsuitable presumption of world visibility. So I'd rather stick to rsync in a cronjob.
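A cronjob for that might look something like the following (the schedule is illustrative; the command matches the one above):

# crontab entry: mirror the writing directory to the bucket nightly at 02:30
30 2 * * * gsutil -m rsync -d -r $HOME/writing gs://bucket/writing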

Though if either copy editor ever expresses a willingness to adopt a git-based workflow I might rethink that. :)

41:

What I wonder is whether it is possible to implement division of labor in a collaboration of writers, beyond simply having different writers write different parts and then trying to glue them together.

I dunno, could you have different people responsible for the story universe, the plot, the characters and the puns*?

*Those are the four main ingredients, right?

42:

If your copy editors are professionals in the trade fiction sector, then they probably get most of their work from Big Five imprints or indy publishers, whose idea of electronic/online document workflow is MS Word centric at the author end and InDesign centric at the typesetting end.

43:

Your breakdown of division of labour implies far more modularity than actually exists. Written fiction is more like spaghetti code with nothing but global variables and computed GOTO as the only flow of control structure than anything that decomposes neatly into different modules and abstraction levels.

Having said that, there are shared universe works, mosaic novels, and ongoing multi-author continuities; look at the "Star Wars" or "Star Trek" fiction franchises, or in the German market, the Perry Rhodan series (aka the Series that Ate German SF). But these either require a benign dictator at the center, or they don't make enough money for multiple authors to make a living off ...

44:

>>> Written fiction is more like spaghetti code with nothing but global variables and computed GOTO as the only flow of control structure than anything that decomposes neatly into different modules and abstraction levels.

I won't argue with this statement, but is it the only way? You certainly can program like that, but there are better practices, as you surely know. Then again, a novel is a "program" that doesn't need to be maintained, so the code quality doesn't matter once it works.

OK, I think I am stretching the programming metaphor too far...

If you were to set out to build a team with the purpose of writing a novel, how would you distribute the work?

45:

A novel is a program that has to run on a huge and unconstrained variety of semi-compatible hardware, no two instances alike, interacting in unpredictable ways with other firmware loaded previously. Worse: the outcome of running a novel is unpredictable -- it's never experienced the same way twice even on the same platform.

For teamwork novels, I'd say the best current practice is the shared universe anthology. Someone invents a world, and sets up a bunch of common shared characters (much like designing a role playing game). They may optionally also design an outline for an overall story arc. Additional writers are then invited to come on board. They either write modules that advance the core story arc, using the central common characters, or they invent their own characters who perform stories off to one side of the main narrative thrust. A proviso is that other authors can "borrow" their characters as bit-parts in their own narratives (but usually making major transformative use of them -- e.g. killing them off -- is frowned on unless it's done by the "owner"). Finally, the world-builder or some other designated editor checks the stories for compatibility, sends them back with editorial advice if they're at odds with someone else's story, and then compiles them into a publishable form. This may be a series of books (e.g. George R. R. Martin's "Wild Cards" series) or some other medium, such as Serial Box.

The problem is, management/coordination between authors adds an overhead that wasn't there when the work all went on inside one person's head. So it's actually less efficient than the small artisanal single-person model we mostly use today.

Yes, electronic tools for massively improving the efficiency of the writing process are possible. The problem is, it got automated in the late 1970s/early 1980s -- these things called "Word Processors" totally revolutionized everything! (There's room for further progress, but it's incremental -- for all that I swear by Scrivener, it's basically just an IDE for long compound documents, while your average word processor is like a text editor. The scope for huge advances in programmer productivity lay in the move from punched cards and assembler programming to interactive time-sharing terminals with text editors and a CLI and compilers, not from text editor/CLI/compiler to IDEs.)

46:

One is, one isn't (though is for things other than trade fiction) but both were quite delighted at the prospect of plain text files. (One good thing I will say about Apple; plain text UTF-8 isn't a struggle for Mac users because it's not optional, either.)

The problem would be getting them to use git, and hosting it, and otherwise going to more trouble than the gubbins are worth since I have to do the merge in any case. (If it was movie scripts or something where the writer and the person owning final approval are distinct, and writing is more dynamic, I might think it more worthwhile.)

47:

Um, in programming, isn't there a "Brooks' Law": "adding manpower to a late software project makes it later"? Why shouldn't that apply to writing fiction?

Brooks' law rests on the need to get workers up to speed before they can be productive, the combinatorial cost of communication (a team of n people has n(n-1)/2 pairwise channels, so each newcomer adds n new ones), and the limited divisibility of tasks.

I'd suggest that adding people to the writing process has the same core issues, plus the lovely feature that most people won't pay more for a fiction book that was written by two people (or 20 people) than they will for one person. So unless people can more than double their sales by writing together (say JK Rowling and James Patterson collaborating on a series, to bring together their hypothetically non-overlapping fan bases and then some), it's not worth the trouble.

Speaking of Patterson, it's too bad he hasn't died yet, so that someone can spill the beans on the complicated commercial operation he's undertaken to corner his part of the bestseller market. I don't mean this in an ill-wishing sort of way, it's just that, in my ignorance, I don't think he's talked much about what he's doing, and it's obvious that he's doing quite a lot. Apparently it's profitable, and that's about as much as we can say, until and unless someone spills the beans on his Master Plan.

48:

Hosting it shouldn't be a problem, as long as you have an ISP who isn't an arsehole, i.e. one that gives you a static IP and doesn't block ports. An old Pentium box sitting in the corner running Linux and a domain name would allow you to host it via your existing broadband connection for beer money.

49:

Hosting isn't a problem.

Justifying the extra effort for the (possibly negative) expected benefit is pretty tricky, though; it wouldn't improve the process notably, and it would involve adding a layer of weird from the copy-editors' perspectives.

Sending out a text file and merging the text file I get back against the original isn't very complicated, I have to do it, and all the version control can live with me. If it was important to track individual copy-edit changes or if there were multiple copy-editors per work doing alternate chapters or something I could see more process being worthwhile, but that's not true for me so I don't have a good feature story. (From an editor's perspective, "git is cool" gets rapidly refuted with "git is weird and annoying" :)

(I'll believe in Strong AI when someone demonstrates unit tests for fiction.)

50:

Speaking of Patterson, it's too bad he hasn't died yet, so that someone can spill the beans on the complicated commercial operation he's undertaken to corner his part of the bestseller market

The New York Times had a profile of Patterson a few years back. It explained how he comes out with a half dozen (at least, it seems) novels a year. Most of those are co-written, that is, he writes a synopsis, and the author with their name in small print on the cover does the writing of the actual novel. Meanwhile Patterson writes one of his own books, and rakes in the royalties from all the various series he has going, everything from crime novels to YA vampire books, with at least one on the NYT bestseller list at any time.

51:

Of course if he did die he might end up as another V.C. Andrews, who "wrote" at least a dozen books after she died. His estate could easily assign one of the co-authors to take over and ghostwrite under his name.

52:

Note: even if you do plan everything out in huge detail for a whole series, it's otters all the way down once you actually start writing it. I was just doing the typing revision of a perfectly reasonable older draft of a manuscript, only to catch myself realizing 'oh, hey, I should fill this plot hole...'

Oops.

53:

a one-shot short novel written with no thought of sequels

Eric Flint has made exactly the same point as OGH, in his case with respect to "1632". He stated that the demand for a sequel or 6 delayed his finishing another series which he HAD planned out in advance.

54:

"The scope for huge advances in programmer productivity lay in the move from punched cards and assembler programming to interactive time-sharing terminals with text editors and a CLI and compilers, not from text editor/CLI/compiler to IDEs."

Yes, absolutely - I remember well. The earlier (though later in the 'PC' world) massive improvement was with (true) operating systems, where you didn't have to reboot every time you made an error, which is a bit like the difference between 'takes' in filming and the equivalent in writing. So it already has that. As you say, IDEs are merely a GUI/whatever for tasks that can be done in scripts, as many developers do. The other really big productivity improvement was the move to higher-level, mistake-resistant, more checkable languages, though the overall progress in that in the programming arena more-or-less halted several decades ago; that's very like the difference between Euclidean geometry and calculus in science.

And moving to a better, higher-level, language is precisely what is NOT available for fiction. I can just see a book being marketed, with the blurb saying that you will have to learn Theoglossa to read it, and that bears the same relationship to Esperanto as calculus does to Euclid's Elements!

55:

Speaking of version control systems for authors, here's my take on the features authors need:

  • Ability to save a snapshot/version at will: vital

  • Ability to retrieve a previous version: vital

  • Ability to selectively merge bits of a previous version, overwriting replacement edited bits of a current version: vital

... And that's all.

Stuff that is nice but rarely used:

  • Ability to fork the project: very rarely needed (I can think of one instance where it might have been useful in my entire career)

  • Ability for two (or more) people to work on the same project consecutively: rare (maybe 5% of books have multiple collaborators, and see Graydon's notes about creative new ways to annoy copy editors)

  • Ability for two or more people to work on the same version concurrently: this is a NEVER HAPPENS situation and if it occurs there will be tears before teatime, UNLESS we're talking about a weird-ass "pair programming" approach as with the Strugatsky Brothers -- this is rare even among collaborations, which are in turn rare.

Deploying git or subversion or something of that ilk to keep track of a novel is like deploying a 747 to nip to the corner shop for a pint of milk for your tea. sccs (blech) or rcs work just fine (as long as it's a plain text format) and cover all the bases (as long as you never, NEVER, put your sccs repository on an NFS share (I have horrid memories of this happening and the hilarity that ensued)).
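For the record, the three "vital" features above map onto rcs one-liners (file names hypothetical):

ci -l -m"snapshot before the big rewrite" chapter01.txt    # save a version at will, keep the working file
rlog chapter01.txt                                         # list saved versions
co -p -r1.3 chapter01.txt > chapter01-r13.txt              # retrieve version 1.3 as a separate file

(Selective merging is then a matter of opening the old and current versions side by side and copying bits across -- no extra tooling needed.)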

56:

The other really big productivity improvement was to higher-level, mistake-resistant, more checkable languages, though the overall progress in that in the programming arena more-or-less halted several decades ago;

I don't think progress in language design halted. Caveat: I'm not truly current, but my understanding is that functional languages with strong static typing are really solid in terms of producing code with no unwanted side-effects (if difficult for the programmer to wrap their head around). Meanwhile, scripting languages from Tcl and Perl through to Lua and Ruby by way of Python are vastly more expressive and concise than the stuff we were using in the 1970s and 1980s while not being intrinsically hard to learn and retaining enough simplicity and elegance that, for example, MIT switched to Python from Scheme for introductory programming courses. (You could do all the metaprogramming stuff the lisp-ers loved while also teaching OOP and procedural styles and a whole bunch of other stuff in a language that's actually used in the real world.)

The real area where progress stopped was operating systems research, as per Rob Pike's infamous paper.

57:

Deploying git or subversion or something of that ilk to keep track of a novel is like deploying a 747 to nip to the corner shop for a pint of milk for your tea.

The reason to use git (or bzr or hg) rather than sccs or rcs is that it's what everyone else uses; it's a live, maintained project, with a community and resources to help you along, compatible with modern everything. Unless you're already familiar with sccs or rcs, you'd never use them for a new project.

It's more like using a Ford Focus to nip to the corner shop, when a Model T would work perfectly well...

By default, git doesn't need a server — it can happily work locally if that's all you need.

(BTW, as far as collaboration is concerned, there's also the situation where it's one person with multiple computers; laptop and desktop, or similar.)
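A sketch of that one-person, two-machine setup, using nothing fancier than a shared drive as the "server" (paths hypothetical):

git init --bare /mnt/nas/novel.git     # once: create a bare repository on shared storage
git clone /mnt/nas/novel.git ~/novel   # once per machine
git push origin master                 # end of session on the desktop
git pull                               # next morning on the laptop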

58:

Yes and no.

There are lots of nice high level scripting languages about, but none of them offer any real advantages over the others except CV bragging rights.

Functional programming features are gradually leaking down from academia and being bolted onto procedural & oo languages in ways that would horrify their inventors. Mostly syntactic sugar.

There are some pragmatic functional languages about, but most people run into haskell and swear off the concept for life unfortunately.

I have seen lots of small incremental improvements in language design over the last 20 years but they don't add up to anything earth shattering.

A bigger thing imho is the massive progress in static analysis. Analysers are actually useful now, and they find real and non trivial bugs.

In any case, your comment about gotos got me thinking that at a low level all code is an unstructured soup full of gotos, and an "optimiser" can easily make it less structured.

Which leads me to propose an LLVM based novel compiler.

59:

"I don't think progress in language design halted. ..."

I didn't say 'design' and didn't mean that. Perhaps I should have explained that I was talking about its overall use in the 'real world' and not in 'computer science'. The problem is that, in the former, for every step forward, there have been two steps sideways and one step back. You raise two points.

Yes, the best 'computer science' languages (not just functional ones) are vastly better - the trouble is that they almost always assume and even require particular types of programming that are often infeasible in the 'real world'. The classic example is languages without proper multi-dimensional array support (i.e. pretty well everything except Fortran and, to a lesser extent, Ada, Matlab and Julia) used for problems which rely heavily on matrix algebra. Partly because of that, but far more because they require much higher skill set and a lot more self-discipline and logical design to use well (not compatible with employing underpaid chimps as programmers), they have almost all crashed on take-off in the 'real world'. Haskell is the main (and very localised) exception.

Computer science and practical software development went separate ways in the 1970s, and are showing few signs of converging, though at least they are now talking to each other (which they used not to do). For example, in the 'real world', C/C++ variants and toolkits have taken over as compiled languages, and even the best C++ ones are regressions on Fortran 90 and Ada 95 from a software engineering (correctness, RAS, security etc.) viewpoint, let alone the best 'computer science' languages.

Yes, Python etc. are streets more expressive and concise than the scripting languages of the 1970s and 1980s (even the ones I used, which were much better than any commercially available). But, from a software engineering viewpoint, they are pretty ghastly. Indeed, from that viewpoint, they aren't even as good as modern Fortran and Ada. And let's not consider their performance and resource requirements, which are usually critical for HPC and embedded uses. And that ignores the fact that the majority of scripting is in languages like Java or, far worse, ones like PHP.

60:

And moving to a better, higher-level, language is precisely what is NOT available for fiction. I can just see a book being marketed, with the blurb saying that you will have to learn Theoglossa to read it, and that bears the same relationship to Esperanto as calculus does to Euclid's Elements!

Well, presumably that's where the division of labour comes in — there would be the language equivalent of inkers, translating to English.

After all, drawing a picture is a similarly one-person task, yet when the business side calls for it, it can be split up between a penciller, one or more inkers and a colorist. Presumably a similar split could be done with writing, if there was enough of a call for it.

Or if some of the parts could be done by LSTM, as the equivalent of a computer language compiler.

61:

The Big Thing in programming today, especially for the web, is automated testing. It's structured, verifiable, can be run at any time during the coding process and it can even look forward past the point of a given nightly code commit to warn of possible future coding dead-ends that will impact the final product (mutation testing).

All the major coding systems have test wrappers of varying complexity and utility but the web languages (PHP, Java etc.) seem to have the largest ecology of frameworks to the point of there being specific "languages" used to create tests (usually based around JSON).

Maybe the Scrivener coders should start thinking about writing a test harness system for writers; they could call it Marty.

62:

There are some pragmatic functional languages about, but most people run into haskell and swear off the concept for life unfortunately.

I convert text files to EPUP3 using XSLT. It's surprisingly straightforward.

(I have put in about fifteen hours on the transform. Run time seems nicely linear at around two seconds per hundred thousand words. Every time I encounter people tearing their hair about converting from Word or similar I have to remind myself just how hard it is to convince people of the utility of form-content separation (in a business context where they are already required to do it) so I remember to shut up.)
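For anyone wanting to try the same trick, the general shape with a standard processor is roughly this (file names hypothetical; note that the uncompressed mimetype file must be the first entry in the EPUB zip):

xsltproc -o OEBPS/book.xhtml txt2epub.xsl book.xml
zip -X0 book.epub mimetype             # stored uncompressed, first entry
zip -Xr9 book.epub META-INF OEBPS      # container.xml plus the content files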

Though people run into XSLT and express distress, too...

OCAML does seem to have adherents; I don't know if it counts as properly functional.

63:

EPUB. EPUB3. The nice, newer ebook format that you can validate without summoning anything dread from the Great Beyond.

(Spelling. We does not haz.)

64:

So they have finally woken up to the fact that that saves money, have they? I first did that in a production context over 40 years ago, and have been using it and trying to promote it ever since. But what you miss is that writing a thorough test suite is several times as much effort as writing the code, and almost all of it is specific to the requirement (and usually the code, too). Generic test suites test only the simplest errors, which take a negligible proportion of the debugging effort to find and remove.

We already have spelling chequers, there are some crude syntactic checkers, and there are some semantic checkers being worked on (in research communities). But ALL of them fail dismally when faced with the complexity of the language used by specialists, including even a lot of correct but unusual prose. Something could be developed that would pick up quite a lot of the simpler continuity errors, like resuscitating the dead, but detecting the subtler ones that are most likely to slip through a human check is a long way beyond the state of the art. And, of course, they generate reams of false positives once the prose gets beyond their assumptions.

If you have access to any, or even spelling checkers, try putting some 18th (or even 19th) century literature, technical documents, or the more exploratory science fiction and fantasy into them. I occasionally try them on my (correct) documents, just to see if there has been any improvement since their inception, and have invariably got the answer "no, not really".

65:

Ocaml isn't bad. It has a nice balance between "purity" and pragmatism.

I appreciate Haskell on an aesthetic level but place more value on the dirty business of getting things done.

66:

"Getting it done" could mean either "getting it shipped" or "getting it to work correctly"; my remarks pointed out that the latter has made little headway in the past 3-4 decades in most of the IT industry.

67:

Yes, Python etc. are streets more expressive and concise than the scripting languages of the 1970s and 1980s (even the ones I used, which were much better than any commercially available). But, from a software engineering viewpoint, they are pretty ghastly. Indeed, from that viewpoint, they aren't even as good as modern Fortran and Ada.

Caveat - I haven't done much Python, my recent experience has all been in Tcl (the industry standard for electronic design tools).

However, you're complaining about the ghastliness of scripting / interpreted languages, then comparing them to compiled languages. Apples and Oranges.

As for "even the best C++ ones are regressions...", I respectfully disagree. On their own, I might argue the point - but when taken in conjunction with the standard template library and Boost library, they take "computer science languages" and stamp all over them with tackety boots.

No researcher is going to feel as much of a glow from "extending an existing language" as from "At Last! The One True Language to Prove My Thesis". So you end up with lots of narrowly-useful demonstrations of theory, but no rigorous effort to bind them into a common usable form. As you say, C++ is having a damn good try (C++11 is rather nifty).

The advantage of C++ combined with STL and Boost is that the boring, low-level, easy-to-make-subtle-errors stuff has been turned into something that "just works". Want a container for your objects? Singly-linked list? Map? Queue? No problem, just use the STL type. Boost::Filesystem? Cracked it. Timezones, locales? There for the taking. Having to roll your own data structures / IO / OS interfaces for every new project, however elegant the underlying language, will always expose you to errors.

As it happens, I have seen genuinely innovative design languages recently - my Digital Comms Lecturer decided to move into industry, and look at domain-specific languages designed specifically to support networking operations. A very small group of us implemented a tool to support it, and we saw order-of-magnitude improvements in compression. 3000 lines of HDL had been expressed as an equivalent 300 lines in the existing toolchain, and could be expressed as 30 lines of Click. His language has now been released in a tool that (even in prototype form) saw the lead customer abandon their "parallel project as a control group" and complete the project very early using the new stuff...

To me, you sound a bit like the computing equivalent of that mature and experienced type who claims that "the popular music these days is rubbish, you can't understand what they're saying, it's too loud, and they wear silly clothes; nothing like the music back in our day". There's some truth to the claims, but some convenient forgetfulness about the rubbish that everyone had to put up with back in those "good old days"...

68:

We already have spelling checkers, there are some crude syntactic checkers, and there are some semantic checkers being worked on (in research communities). But ALL of them fail dismally when faced with the complexity of the language used by specialists, including even a lot of correct but unusual prose.

I disagree. We were using Coverity and Bullseye as tools; run nightly against the production wavefront of an extremely large (and unfortunately overcomplex) C++ and Java codebase.

Coverity is more a static analysis tool but with some dynamic consideration; sort of like "Lint on steroids", but acting as a workable IDE for a large project team.

Bullseye is a dynamic analysis / coverage tool; it monitored how our suites of unit and regression tests executed the codebase, and provided figures around which function calls were being executed, and which branches taken or not taken. This gave us a good idea about where our code coverage was lacking.

Consider also Rhapsody - code autogeneration from UML diagrams, based around David Harel's work on statecharts, used in a rather demanding live production environment for over fifteen years now (radar data processors).

These are production tools, out in industry for well over a decade (I was using QA C twenty years ago). No, they don't "fail dismally", unless you've been sold them as a silver bullet - they're very helpful when used correctly.

69:

Look, I know C++ pretty well, use it, taught it, was active (though not a principal) in the development of C++11, and could go into arcane specification and implementation details of the STL and (worse) Boost, including all those aspects you mention and more. I am also a Python user, including when it comes to language extensions (both in itself and in C). It's not what you can do with them that is the problem, but the way that it is so easy to trip across one of their restrictions. Indeed, when it comes to advanced use (including parallelism), the C++ standard is so ambiguous and even inconsistent that there is no consensus on precisely what it means. That is really bad news for correctness, portability and RAS.

I shall now identify myself very clearly to Those In The Know :-) I take it that you are not au fait with modern Fortran or Ada? I am, with one of them, and am uniquely well-qualified to compare that with C++, especially from a correctness, portability and RAS viewpoint. Yes, it's arrogant, but identify me and you will see why it is true. My assertions stand.

70:

''No, they don't "fail dismally", unless you've been sold them as a silver bullet - they're very helpful when used correctly.''

Exactly HOW would those help to check English prose for syntactic and semantic correctness and consistency, which is what I was talking about in that paragraph?

Incidentally, I was using tools like that well over 20 years ago; indeed, I have been involved in the design of a few. I could go into what they will check and what they will not, but that was and is not my point, which was about written English.

71:

Damn. That's misleading. I should have said "Indeed, when it comes to advanced use (including parallelism) of the STL, the C++ standard is so ambiguous and even inconsistent ..." Parallelism in the language itself is mostly well-defined and consistent, even if truly evil to use correctly; Hans Boehm and others (including me, as a very much lesser light) took a lot of trouble to ensure that was so.

72:

>> I shall now identify myself very clearly to Those In The Know :-) I take it that you are not au fait with modern Fortran or Ada? I am, with one of them, and am uniquely well-qualified to compare that with C++, especially from a correctness, portability and RAS viewpoint.

Have you tried Rust already?

73:

"Have you tried Rust already?"

No. I was seriously put off by its claims and the lack of a specification, but I mean to look at it sometime, and see where on the spectrum from my expectation to its claims it comes. I absolutely LOATHE having to reverse engineer specifications from programming guides, as it makes it impossible to be sure whether a problem is your mistake, a bug in the implementation, or a defect in the design.

74:

Dropbox is already deploying code written in it, if you like arguments from authority. :-)

75:

I take it that you are not au fait with modern Fortran or Ada

You're right, I'm not. As one lecturer jokingly put it: "BASIC is a language written for children; C is a language written for grownups; ADA is a language written for hardened criminals"

Our radar had about 40 Ada programmers working throughout the 90s and early 00s; for all that their department head claimed that "Ada was a specification language", their defect rates and output were in no way better than our 40 C / Assembler programmers next door.

I may have been unimpressed by the support for the Ada language; from the mid-1980s when there were only two compilers (DoD and Edinburgh University, and DoD was the buggier one) to the 90s, when our ADA team took on Telelogic Tau as their tool. Followed by Telelogic's eventual decision to discontinue support, "but if you give us $100k we'll fix the bugs you found, we denied, and then you proved were present in our compiler".

While there are undoubtedly a lot of "explicitly undefined" behaviours in the C/C++ standard, at least there are some benefits to having the GNU C++ compiler competing with the Microsoft C++ compiler :)

Lest you think that I'm an unquestioning fanboi, I will confess to being a fan of Les Hatton's "Safer C" stuff since the late 1990s - his attempts to bring some measurement rigour to the wilder claims of process and sales types were refreshing... his website is well worth a visit, the papers are excellent.

76:

Exactly HOW would those help to check English prose ...which is what I was talking about in that paragraph?

My apologies, when you said "Language" in a comment that replied to another that was 2/3 discussing Programming Languages... ambiguity, got to love it.

Partly because of that, but far more because they require much higher skill set and a lot more self-discipline and logical design to use well (not compatible with employing underpaid chimps as programmers)

This bit, however, grates. Yes, I'll complain about poorly-trained Java programmers with no grasp of what really lies under the abstraction. But that blames the training, not the individual.

Ease of use and maintainability are key measures of a programming language. We don't insist that every driver be capable of driving an F1 car, or insist that all cars have F1 performance. Providing an easy-to-use language with acceptable failure modes is a plus, not a minus.

I'm currently in a new job, trying to decode the scribblings of an obviously-capable Boost contributor who thought that any code comments or design notes were beneath his stellar software experience. Unfortunately, while his arrogance matched his intelligence, his infallibility did not - and I'm trying to pick up the pieces.

Your comment about "underpaid chimps" therefore makes you look less of a Gentleperson than I hope you are.

77:

Agreed.

One advantage of git (etc.) over older tools like rcs and sccs is that it has an active user and development community: if you run into problems or need to do something unusual, you are more likely to be able to google fixes for something current. And if you are an individual author working with straight text, all of the tools are pretty interchangeable.

If you are doing something multi-media, or thinking about doing games development (anything where you may be dealing with versions of binary images and other file formats in addition to text files), there are reasons Perforce is used by Pixar and many of the big game development companies. And it is free for up to 20 users or 20 workspaces.

I used Perforce, not because I needed the full functionality, but because I wanted practice with setting up the client/server configuration with a recent version, and setting it up for my writing environment killed two birds with one stone.

78:

''Your comment about "underpaid chimps" therefore makes you look less of a Gentleperson than I hope you are.''

That was actually a blast against Senior Management. They pay peanuts, get chimps, and then expect actual humans to pick up the pieces for extra-large, dry-roasted peanuts. Been there, done that - as it is clear you are doing too.

I would dispute that ease of use per se is a good criterion, because its only real merit is that it allows Senior Management to employ chimps. A better criterion would be productivity relative to ease of use - i.e. a hard-to-use language should be extremely productive to justify its existence. I accept your point that the fault is often poor training (blame Management, again). Maintainability as a criterion I agree with.

79:

And an older comment was that C is a hacker's wet dream. As someone who spent 15 years being very active on WG14, and who is to blame for some of the features of the language, I agree with that. As someone who has written hundreds of thousands of lines of assembler, knows the C standard very well indeed, has implemented it on a hostile system, has written code that works on systems as disparate as System/360, ICL 1900, VAX and CDC, and has programmed down to the interrupt-handler level, I can write portable and reliable C. But, even given that, I can't teach anyone else how to do it in a reasonable length of time. Yes, it's that bad.
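
To make "portable" concrete, here is a trivial sketch (invented for this comment) of the kind of assumption that bites when you move between systems like those:

    #include <stdio.h>

    /* Two classic portability traps.  Whether plain char is signed is
     * implementation-defined (it differs between common ABIs), and
     * character codes are not guaranteed contiguous: on an ASCII
     * machine 'I' + 1 == 'J', but on an EBCDIC System/360 it is not. */
    int main(void)
    {
        char c = '\xFF';
        printf("plain char here is %s\n", c < 0 ? "signed" : "unsigned");
        printf("'I' + 1 == 'J' is %s here\n",
               'I' + 1 == 'J' ? "true" : "false");
        return 0;
    }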

80:

"As someone who ..."

I may have been unclear, again. That wasn't meant to be bragging, but was attempting to describe the prerequisites a programmer needs to write seriously portable and reliable C. Yes, seriously.

81:

''But, even given that, I can't teach anyone else how to do it in a reasonable length of time. Yes, it's that bad.''

IMHO it wouldn't matter what language you used - it's a nasty problem that needs good basic understanding and training. It's a bit like parallelism: some people take a long time to get it, and without a good grounding in the machine layer you're doomed...

Fundamentally, it's about good design - and only then about language features. The people who write dangerous C will quite happily write dangerous Ada, because the problem with foolproofing is that fools are so damn ingenious...
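
By way of illustration, a contrived toy (mine, not real code): the bug below is a design failure - no length contract between caller and callee - and a careless translation into Ada merely swaps silent stack corruption for an unhandled Constraint_Error. The program is broken either way.

    #include <stdio.h>
    #include <string.h>

    /* greet() promises nothing about how long a name it can handle,
     * and the caller promises nothing about what it passes.  C
     * compiles this without a murmur; if name is longer than 8
     * characters the stack is silently corrupted. */
    static void greet(const char *name)
    {
        char buf[16];
        strcpy(buf, "Hello, ");   /* 7 chars + NUL: fine */
        strcat(buf, name);        /* overflows buf for long names */
        puts(buf);
    }

    int main(void)
    {
        greet("Bob");                                 /* fine */
        greet("a name far too long for the buffer");  /* undefined */
        return 0;
    }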

How much of it is that the time necessary to mentor and develop young software engineers just isn't there?

The "we need to invest in the medium-to-long-term training" takes a battering after every recession, as everyone (particularly US listed firms with their fixation on quarterly results) focusses on short-term gain for reasons of survival. Of course, with the excuse of "we can do that stuff later, we just need to get through the next two years or there will be nothing to mentor about"

By contrast, I had the time and top-cover in the mid-90s to mentor my team to deliver embedded real-time C to the standard that I wanted, and the customer needed - while also taking the time to teach design, not just coding. And I wasn't alone in our lab.

Being an effective trainer is difficult enough - it also needs an environment in which the trainee is willing to listen. My assertion was that if you took a graduate and taught them correctly for the first two or three years, you had a chance; but if you didn't, there was the risk that they had already developed "but I'm an experienced engineer, I know what I'm doing, who are you to tell me how to..."

I left one job because my manager was certain that he knew best; after all, he had a Ph.D. and fifteen years' experience - but he had made the jump to "I'm a manager, I don't have time to do delivery" too soon. While he knew the problem domain backwards, he had never worked next to more than five or six other engineers, or for more than one company, or in anything other than a very "process light" environment, and he had no real experience of system architecture or design. So: big blind spots and total self-belief.

82:

I note that there was a recent blast by djb over how the C language is almost impossible to use correctly, and that anyone trying to do cryptography in it is sooner or later going to be hit by that.

(Yes, one knows what this code does on this platform, but on that other one? You, Elderly Cynic, of course know this.)
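
One concrete instance of djb's complaint, sketched as a toy (mine, from memory):

    #include <string.h>

    /* The compiler can see that key is never read after the memset,
     * so dead-store elimination may legally delete it, leaving the
     * secret sitting in memory.  C11's optional memset_s (Annex K)
     * exists for exactly this reason, but support for it is patchy -
     * so what this code does really depends on the platform and the
     * optimiser. */
    void use_key(void)
    {
        unsigned char key[32];
        /* ... derive the key and encrypt with it ... */
        memset(key, 0, sizeof key);  /* may be optimised away */
    }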

On the subject of Rust - as a language it looks really quite nice, in that it tries to make a lot of bugs impossible to write outside explicitly-marked unsafe code blocks. It picks up some interesting ideas from all over the place, but looks to me a little like that slim language that's trying to escape from C++. A language which doesn't so much forbid dangerous constructs as make them impossible to express is a nice idea.
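
As a sketch of the sort of bug it rules out (my toy example, not anything from the Rust documentation): the C below compiles cleanly on many compilers, while the equivalent in Rust - using a value after it has been freed or moved - is rejected at compile time unless you wrap it in an explicit unsafe block.

    #include <stdio.h>
    #include <stdlib.h>

    /* Use-after-free: legal-looking C that may even appear to work
     * until the allocator reuses the block. */
    int main(void)
    {
        int *p = malloc(sizeof *p);
        if (!p)
            return 1;
        *p = 42;
        free(p);
        printf("%d\n", *p);  /* undefined behaviour */
        return 0;
    }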

83:

(I will just confirm that I have seen Elderly Cynic on the WG14 and then latterly the WG21 mailing lists. Those are the C and C++ Standardisation groups)

84:

THIS!!

I used to have a manager who "wrote Fortran in Ada": he ignored all types except integers, floats and arrays (including strings as arrays of characters), used "magic numbers", used GOTOs (yes, that's right, GOTOs in Ada), ignored defined exceptions...

85:

That is old Fortran, which has been superseded since 1990; almost all current code is written in the modern language, which is very comparable to Ada in many respects. Note that it can be taught and learned to the same level of usefulness (though for different styles of programming) in 20% of the time that C++ takes - and that's measurement, not just theory.

87:

Absolutely. I was not pretending that I am alone in my understanding of C, but people like djb and me are very much in the minority. Thanks for your comments on Rust - I will give it higher priority - and I really, really agree with your last sentence. Thanks also for confirming that I really do exist :-)

88:

I used to be quite keen on SISAL. Pity it never really took off, but it's probably impossible to make serious inroads against something as entrenched as FORTRAN. Netlib alone is almost a good enough reason to stick with it.

Reforming FORTRAN was really the only way to go, and what little I have seen of newer code is quite acceptable. Unfortunately there are a surprising number of people out there who insist on using Fortran 77 or worse, and a lot of them teach "the programming module" at universities.

On the C++ front, you are probably right. I don't think anyone except Stroustrup really loves the language, but again you have network effects that make it the rational choice anyway.

89:

"On the C++ front, you are probably right. I don't think anyone except Stroustrup really loves the language, but again you have network effects that make it the rational choice anyway."

Regrettably, most of the people active in WG21, Boost and similar projects love its uncheckability and loose specification, claiming that those are the only way to get performance and flexibility. Bjarne is not one of those, incidentally.

90:

I really enjoy Rust so far, as a newbie whose first language is Python and who wanted to learn some low-level programming but was intimidated by C/C++. The compiler's error messages are amazingly helpful.

91:

Much architectural diversity, very headdesk, wow!

(Sorry, couldn't resist.)

((Presumably the fact that M4 is supposedly Turing-complete doesn't hurt.))

92:

Actually, I have never used M4, and much of that was before those days anyway. But you are quite right - it's the sort of battle-hardening that leaves one either shell-shocked or intensely paranoid, or both.

93:

''Presumably the fact that M4 is supposedly Turing-complete doesn't hurt.''

M4 is a lovely tool for building structures on top of text files. It's Turing complete in the same way that Minsky register machines are - "you don't want to use this to do that" - but it offers some fantastic ways to quickly build powerful intermediates.

At one point I used it to drive dot to generate project plans and work breakdown structures from the same file...

94:

We used M4 to build up assembler on a small project I worked on fifteen years ago; we were engaged to write the microcode for a telecomms ASIC that was among the first to market for 40G, and certainly one of the first to support the OTN protocol.

The various macros contained assembler and some programmability at the bottom layer, then we built up from that. A certain amount of brain bleed was involved, but the customer was happy; it all worked, and AFAIK it is providing backbone optical connections all over Asia...

95:

Has anyone since Chekhov had the sheer nerve, though, to leave a gun on the mantelpiece in the first act, fire it in the third - and to miss? (Well, maybe read Uncle Vanya's third act, or the Cliffs Notes for it.) Anyhow, actually hitting anything might be required by one's plot situation, but not by Chekhov's law :)
