
How I got here in the end, part seven: bubbling freelance

In general, having your employer go tits-up at the end of the month without a pay packet in sight is a Bad Experience. I couldn't, in all honesty, recommend it ... but if it's going to happen, it's best that it happens in the middle of a tech bubble that you're part of.

Here's what happened to me between the collapse of FMA Ltd and the formation of Datacash Ltd ...

I'd been registered as part-time self-employed with the Inland Revenue for years, thanks to the writing sideline, so I had half a clue about handling expenses and organizing invoices. And the summer of 1996 was about twelve months into the dot-com bubble, and I was one of the more experienced perl hackers on the market (the whole area of CGI programming having been embryonic just 18 months earlier). Some weeks before I maxed out my credit cards I landed some freelance programming work rewriting the web-based remote administration interface for a standalone email virus scanner called WebShield that a sometime acquaintance from Oregon had thrown together and got McAfee to sell. WebShield was (in those days — I have no idea what it is today) a standalone Linux box with two ethernet ports, one going in and one coming out: in the middle sat a ferociously paranoid distro with a firewall. One port open to the outside world (SMTP) and two open ports on the inside — one for SMTP traffic, and one for HTTPS to allow the users to configure the box. It took me a while to get the job done (it wasn't tiny), but it also enabled me to get back on my feet and start looking for more work. I had, in short, temporarily mutated into a contractor.

That sort of thing makes you re-evaluate what the hell you're doing with your life — not in as soul-searching a way as cancer or the death of a loved one, but still: I knew I wanted to be a writer. But firing the odd feature article at Computer Shopper wasn't where I wanted to be. I abruptly realized that I'd managed to sell one short story in 1995, and it was an anthology reprint of a story I'd previously sold in 1994 — I hadn't done anything original. In fact, during the period 1993-1995 my fiction output dropped almost to zero.

I took a hard look at what I'd been doing since 1986, when I first came out of university, and realized I'd made a complete balls-up of just about everything. On the other hand: I'd learned quite a bit that they don't teach you in academia. I now had some idea how the publishing business worked (and could account for exactly why my first attempt at climbing a well-understood ladder in the SF field had failed). I came up with a Plan, for the fiction side of things. Which was: (a) write stuff, both short stories and novels. (b) Send them to America, because that's where the money is — the market being a whole lot bigger. (c) Start at the top, with the big name magazines (Asimov's, F&SF, Analog) rather than at the bottom. (d) Write novels. Make each novel potentially the start of a series, but don't write the sequels. Make each novel different, using different sub-genre conventions, so that if one type is unpopular another can sell. And (e) get a literary agent in New York who will target the US market.

Accordingly, while working on WebShield, I began work on a new novel in my spare time — and some short stories. I'd begun it while at FMA, after the web book was finally out of the way, but made slow headway due to, well, working in a start-up. The first draft took about 15 months to emerge, and I completed and polished it in early 1998 before sending it off to sit on an editor's desk. Its title at that point was "Festival of Fools"; when it was finally published, years later, it was changed by editorial fiat to "Singularity Sky". I also wrote (or rather finished — I'd begun it at SCO) "Antibodies" and wrote "A Colder War" around this time. I can't claim to be totally consistent; despite deciding to target the big American magazines, I punted those two at a small British mag called "Spectrum SF", largely because the editor was asking me for stuff. However, those two stories got Gardner Dozois' attention — in addition to editing Asimov's SF he was editing the seminal "Year's Best SF" anthology series, and read everything that saw print. So the effort wasn't entirely wasted ...

By the time WebShield was finalized, I had more work on the horizon, and had decided that I kind of liked this whole contract programmer gig. Lots of time being gainfully unemployed (you need to use it to put the hours in studying, because your employers won't pay for you to do it on the job) interspersed with periods of frantic but ridiculously well-paid work; it's a bit like being an actor, even down to the requirement to deliver your lines perfectly. (Lines of code, that is.) On the other hand, my next job took a bit of the shine off the apple: it was a specialized programming job for Demon Internet.

Companies have public images that are very different from the actual internal truth. Demon was seen by outsiders as being a bunch of clueful techies; I guess most people imagined a bunch of hairy hackers in jeans and tee shirts feverishly bolting together racks of expensive electronics in a converted warehouse. AOL, in contrast, was seen as corporate, suit city. The truth (in the late 90s — I can't speak for current practice) was almost exactly reversed; AOL was almost Californian in its laid-back ways, while Demon was all but running on the old IBM dress code. It was also, by early 1998, a hive — offices bulging at the walls with staff, mostly hired in the previous 12 months. Lines of responsibility weren't 100% clear, and, as with all businesses that grow too damn fast, folks don't talk to each other all the time.

I was hired by arch tech-head Clive Feather, evidently off his own bat (as far as I recall), to write a web front-end for the Internet Sign-Up wizard in Microsoft Windows 98. Not being a Windows user (I was 100% Linux/Mac at that point) wasn't a problem; this was the back end stuff to process new sign-ups. I'd met Clive back at SCO, where he'd arrived by way of some start-up operation in Cambridge to be the personal IT fixer for the executives; now he was one of Demon's managers. I got the basic coding job done, stuck a rather ugly web form on top of it as per spec ... and on my last day, got summoned into an office and carpeted: one of Demon's other princes of darkness had finally learned about what Clive was doing, and hit the ceiling.

You know the joke about how the reward for a job well done is another job? The one month contract stretched on for two, and then three months, as the customer-facing interface was handed over to Demon's web department for redesign, and I was hauled back in to re-code the templating system I'd written (in a half-assed way, the monicker "templating system" not really having been invented for web apps at that point). Worse, this job needed to be done at Demon's web operation's HQ, which — Hendon being full — had been established on an industrial park down in Dorking. In a suit, in a heat-wave, commuting from Edinburgh.

After spending a couple of months there, it is my opinion that there is nothing wrong with Dorking that can't be fixed. By landscaping. With strategic nuclear weapons. (Think: sleepy little commuter dormitory town within a half hour train ride of London. Think: sleepy little town with nothing going on after 11pm except for the local youth stumbling drunkenly out of the pubs, throwing up, and stabbing each other. Think: twitching net curtains. Not my kind of place, in other words: I'm a city boy at heart.)

Anyway, the job got done. I was, by that point, heartily sick of it; and not merely of Dorking. Being caught in the gears between two dueling management chains is not the best place for a contractor to be, because everybody is acutely aware that you are paid more than they are (being paid by the hour — they don't notice all the hours you rack up in between jobs, chewing your fingernails and looking for work, "resting"), and you're an outsider.

In this case, the clash of ideas at Demon — between Clive's neat gee-whiz tech idea of how to implement an interface to Microsoft's sign-up service, and the web and design folks' idea of how to implement a public-facing front to Demon's corporate look and feel — had me caught in the middle like a bug in a gearbox. (Plus, I found that the corporate culture made me itch: you can put the hippie in a suit, but you can't put the suit mind-set into a hippie. Not that I'm a hippie, exactly — more like a somewhat oblate goth — but the point stands.)

Clearly if anything is wrong it's most convenient to point the finger of blame at the guy they don't have to work with every day. And while big corporate consultancy businesses like EDS or CAP-Gemini or whoever have corporate contracts departments to handle dispute resolution, there's a simple way of dealing with the solitary contractor: you fire the scapegoat's ass. I did not get fired; but I didn't ask for follow-on work, nor was I offered any.

Besides, I was about to become busier than a very busy thing indeed. Because that's when Dave MacRae emailed me out of the blue, and suggested we meet up and talk about a little business idea he and a co-worker were hatching, all to do with this new field called e-commerce.

(To be continued ...)



55 Comments

1:

Hmmm. In the comments thread on one of the earlier instalments in this series, someone (it may even have been our esteemed host himself) mentioned 'cargo cult computer programming courses' in UK universities. I raise this point because in last Sunday's Observer there was a column calling for more investment in IT training in the third-level sector, and less of an emphasis on the arts and humanities sectors. Now, having read each part of 'how I got here in the end', and the ensuing comments threads, it seems to me that the key to a successful IT career is the acquisition of keenly developed critical thinking skills; something which arts and humanities at their best can provide. I'd be interested in people's views on this one; should the third-level education system be shifted towards an emphasis on IT and so forth, or would that be just another exercise in 'cargo cult'-ism?

2:

FWIW, you are entirely right about Dorking. Living there for six months was just about the worst time in my life to date.

3:

Thank you for your wonderful description of my home town, it has made my morning.

There's something deeply depressing about going home and realising that everything stops at 11 (well, some pubs are open till 1 now, but only if you're underage as far as I can tell).

Thanks for making my morning!

4:

@1 - key phrase there being "at their best". I've gone through formal techie/engineering education up to Uni level, then shifted to a BSc-equivalent in humanities. Both of those were for the funny pieces of paper that you get at the end. What actual useful knowledge and critical thinking skills I acquired came along "on the street". I have yet to meet an arts and humanities teaching facility that doesn't produce mindless derridian drones, hysterical nazifeminists and their ilk.

5:

@1: I think that in practice, programming requires good abstract reasoning skills -- but you can come at it from the perspective of language and lexical manipulation and grammar as well as from a formal logic/mathematics background. As I think I mentioned earlier, I didn't study mathematics to "A" level, and apart from some supplementary courses, "O" level maths is where I stopped -- but that didn't prevent me from getting a CS degree and working as a programmer. (It made the physics "A" level somewhat harder, and probably cost me a grade or two in the end, but that's another matter ...) Mind you, I got the "new" maths, in the mid-to-late seventies -- we hit Boolean algebra and set theory before we got to basic trigonometry, and did an optional course on programming (including writing BASIC programs for off-site execution) at age 13 or 14. I suppose if the basics of formal logic get drilled into you at a low enough level, the CS stuff is, if not intuitively obvious, then at least vaguely familiar and easy to pick up.

I've got no idea where fashion has taken mathematics in secondary education during the 1980s and 1990s; if they've kept on teaching set theory and logic, then it's likely that your typical arts graduate with a GCSE in maths in their background has got sufficient background (even if they don't remember it) that once they blow the dust off they can do something with it.

Where humanities students are going to suffer in dealing with computers is in the bits where, if you stick the plug in the wrong way round, the magic smoke comes out: namely, hardware.

And this assumes that what I'm talking about and what the Observer columnist is talking about when referring to "IT training" are the same thing. Back in the 90s, "computer science" meant stuff like formal logic, programming in ML, and how to design and tape out a microprocessor; whereas "Information Technology" was a euphemism for "how to write a spreadsheet in Excel and get the word processor to check your spelling so that you don't look like a complete numpty". (I caricature the fields slightly, but I hope you can see the distinction ...)

6:

I don't think it's an IT specific thing - I've noticed that graduate accounting trainees tend to be much better if they've done a degree other than the depressingly ubiquitous 'business and finance', though one shouldn't generalise.

7:

I don't recall just when Demon was bought out by large corporate interests, but I can check.

Press release, duplicated in plain ASCII in the demon.announce newsgroup, date 1st May 1998, has Scottish Telecom paying GBP 66 million for Demon Internet

There's a noticeable change in the way the newsgroup got used by Demon.

8:

A-level maths hasn't changed much. I'm not at all sure about GCSEs. Back in my day, there was a lot of variation between examining boards, and I saw a CSE maths paper with very basic Boolean algebra questions, something which never appeared in the O-level and A-level courses I did.

9:

Ah... Spectrum SF. Charlie, that's where I first read your work: "Atrocity Archives" serialized in said mag. Doubtless you'll come to that in subsequent offerings. I was about to lament the demise of Spectrum SF, but now I'm just confused at the wonky dates/issue numbers on their website; are they alive again or not...?

10:

Back in 1980 or 1981 we developed a hypothesis, based on personal experience, that the key training for a programmer was an interest in building stuff from Lego to an advanced age. I have yet to find a significant set of contrary results; while I've found people who played with Lego but didn't go on to be innovative programmers (offers himself as an example) I haven't found an innovative programmer who couldn't be bought with Lego Mindstorms.

There is something in the combination of creativity, unlimited possibilities and a restricted set of tools which just seems to click. Excuse the mild punning.

11:

@1, @5:

Having followed a slightly less convoluted path to IT (I wanted to be a chef, failing that a career in the military seemed sensible and them thar computers are if nothing else fun playthings), it takes an ability for abstract reasoning, some basic logic (not ALWAYS necessary, but handy), a small smattering of maths (over and beyond the boolean logic; it's handy to have at least a vague idea how to do a proof or demonstration, especially when you end up having to do things from scratch) and patience.

The last mostly because all computer languages are very particular about what they eat and will happily barf undigested code in your face, while screaming "WAAAAAH! WAAAAAAH! WAAAAAH!", if you're lucky. If you're unlucky, they'll happily go "munch, munch" and do something that's almost completely different from what you wanted.

12:

@9 re: Spectrum SF - yes, the Spectrum website looks weird. I think all those 2009 dates should be 2002, as per the info on this website (scroll down), and I am sure I had issues 8 and 9, years ago.

13:

Just to say I'm really enjoying reading this "Getting there" series Charlie.

Funnily enough Scalzi's website today has an article on the 'age' of authors and getting there (to some degree) and it's really warming to know that most authors aren't 20-year-old supergenii who appear able to ghost-write Tolkien.

Also that eventually the odd plethora of jobs I've found myself in may actually contribute to a good novel!

14:

Up until your post, Charlie, I always used to envy the politics-lite salary-heavy IT contractors I used to spy from the Dilbert cubicle-walled realm of my sad little web master's cubicle when I still had an IT day job, but I think your post's just opened my eyes!

The grass is always greener on the other side - and I definitely agree that working on-site is the demise of many a consultant/contractor. Distance = genius: familiarity = contempt.

I'm sure there must be a magazine article somewhere inside you comparing the existence of an IT consultant/contractor with the life of a Renaissance-era mercenary fighting for the Medici family in Florence, the Doge in Venice, the Sforza family in Milan etc.

All the best

Stephen

SFcrownest.com & StephenHunt.net

>>>

"He who holds his State by means of mercenary troops can never be solidly or securely seated. For such troops are disunited, ambitious, insubordinate, treacherous, insolent among friends, cowardly before foes, and without fear of God or faith with man. Whenever they are attacked defeat follows; so that in peace you are plundered by them, in war by your enemies. And this because they have no tie or motive to keep them in the field beyond their paltry pay."
-- Machiavelli, The Prince

15:

Nicholas at 12: Yes, you're right. In fact, looking at the cover of issue 9, I *know* I have that one (it's the one with the final part of Atrocity Archives, presumably).

*sigh* for a moment there I had hopes of it having been resurrected; there was some good stuff in there, if I remember rightly, at a time when Interzone was getting a bit too unreadably-cutting-edge for my tastes. Speaking of which, has Interzone changed much under the new management? Maybe I should renew my subs and find out...

16:

Dave Bell @ #7:

THUS, the entity formed from Demon and Scottish Telecom, is now owned by Cable & Wireless (with some parts spun out to the new THUS).

17:

Never realised you worked with Demon. I was there from around mid-97 until late 98 myself; I think I left about 6 months after the Scottish Telecom buyout. The atmosphere in Finchley wasn't that corporate at the time, though it did seem to be heading in that direction, which was part of my reason for moving elsewhere.

18:

>>>I have yet to meet an arts and humanities teaching facility that doesn't produce mindless derridian drones, hysterical nazifeminists and their ilk.

You've obviously not been trying hard enough. I'm more of a Foucault man than a Derridian, but I've known Derridians who had some interesting things to say. As for the nazifeminists, they do have nice uniforms. ;-)

Charlie: about abstract reasoning - when trying to get an undergrad to write a thesis comparing rape in the civil wars in Democratic Republic of Congo and the former Yugoslavia, I told her to think about when her maths teachers taught her about fractions, and invited her to consider what the common denominators in DRC and the former Y. might be. . . This was in Birmingham, where we'd check each other's marking. The colleague I gave this student's work to for double-checking came back to me and said 'how did you get this level of work out of this student? I had her for one module and she was a waste of space'.

19:

I'm not sure mathematical training is needed for programming. I was programming in my 3rd year of secondary school; now I was good at math, but there were obviously topics we hadn't covered.

Math makes a good basis for a computer scientist and helps in a certain class of algorithm creation and optimisation (thus is important to be a _good_ programmer), but I'm not convinced it's needed to understand programming.

I read/heard somewhere that one of the key identifiers as to whether someone can learn to program is whether they grok the concept of "variables". If they get this, then they can learn to program; if they struggle with that then they'll most likely fail. Mathematicians can struggle with this concept just as much as anyone else because, while the terminology used looks close to math, the nature of variables doesn't really map properly onto parameters used in equations.

20:

Stephen: My understanding is that there are three levels of abstraction required for programming:

* Variables

* Loop structures (and iteration; implicitly, recursion)

* Pointers

More folks can grasp variables than loop constructs, many folks get to loop constructs but crack their skulls over pointers. Each demands, in turn, a greater level of abstraction. Got variables, but not recursion or loops? You're not going to get far. Got flow-of-control but not pointers? Stick to Java or COBOL. Got pointers? Congratulations: you can probably understand what you're doing, rather than simply going through the motions.
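For what it's worth, the three rungs can be compressed into a toy sketch, using a Python list as pretend memory and plain indices as pretend addresses (Python has no real pointers, and this is not how it stores variables; it's purely illustrative):

```python
# "Memory" is a row of numbered cells; a "pointer" is just an index into it.
memory = [0] * 8

# Level 1: variables -- a name bound to a storage cell.
X, Y = 0, 1               # variable X lives in cell 0, Y in cell 1
memory[X] = 3
memory[Y] = 5

# Level 2: loops -- repeating a step while state changes.
total = 0
for i in range(1, 5):     # 1 + 2 + 3 + 4
    total += i

# Level 3: pointers -- a cell that holds the *address* of another cell.
P = 2                     # variable P lives in cell 2 ...
memory[P] = Y             # ... and holds the address of Y
memory[memory[P]] = 42    # dereference: write 42 through P, so Y changes

assert memory[X] == 3     # untouched
assert total == 10
assert memory[Y] == 42    # mutated via the pointer, not via Y's own name
```

The double indexing in `memory[memory[P]]` is exactly the extra hop of abstraction that cracks skulls: you have to track not just values, but which values are addresses of other values.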

21:

Charlie: I'm sure you've read Joel Spolsky's screed on 'Leaky Abstractions'. Your three levels of abstraction fit nicely into that model.

However, I'd like to propose a fourth level -- of 'un-abstraction': understanding what is happening in the hardware when your program runs.

Most programmers (even those coding in C) never reach this level, and frankly do not need to. For your average code monkey the greatest skill you can have is a deep understanding of the underlying APIs. (For example, Java and C# aren't about the languages, but the APIs.) These rich APIs are enabled by powerful hardware with layers and layers of OS support.

But a few of us have to play our tunes on instruments carved by hand from rocks, reeds, and old sticks. We work on the hardware level where *you have to know what the CPU is doing*. Where the phrase 'latching a register' has meaning. Where you care about things like bus clock versus CPU clock versus external source clock. Where debugging skills range from black art to outright necromancy.

As you alluded to earlier, no humanities course can prepare you for this level of understanding. You can, however, achieve it on your own if you are willing to work at it. You don't need a CS degree, but you do need the ability to internalize how the hardware is actually working. I suspect the latter cannot be learned.

22:

What is needed for programming, though, and what I never got the hang of, was boilerplate. Which mostly exists for lack of reasonable defaults.

Why the hell do I have to use obscure libraries and deal with make-files and linkers, just to write and compile the 'hello world' extended edition that can calculate the sine of a number? Compiling is an unnecessary nightmare for a beginner, and so are libraries. Ok, here that's just one line, but I'd much rather be required to #exclude it, if it really slows down the compiler that much, than to include it. If you don't like the defaults, you could probably get something like #exclude and get your standard behavior back.

Worst offender is usually graphics (which is about as tangible as you can get in the beginning), of which there is a seemingly unlimited number of mutually unintelligible implementations. Each with its unique version of as-of-yet incomprehensible gibberish that has to be copied from some obscure tutorial just to put a green pixel at position x=300 y=235 of a window. At some point you find yourself with a nice little fractal generator, and the next time you start writing a program, you open your fractal generator code, throw out everything you find understandable and put some new stuff in, hoping that you didn't break any of the incomprehensible boilerplate magic.

But maybe I'm just too demanding, because my first exposure to programming was the C64, and all you needed to run a program was "run". (Although Basic couldn't do graphics in a meaningful way either.) The PC I got later had no obvious way of being programmed, so I slowly lost interest; Turbo Pascal at school only came much later, and I continually felt like I was being intentionally shut out of most of the computer with it. What? I can only use 16 colors? Using longfloat (or whatever it was) returns an error even though it's in the manual?

Sorry about the rant, but this is what held me back much more than any of the math or abstract concepts. That is, besides the use of English instead of my native German. It kind of sucks if you don't know what "input", "print" or "validate" could mean and you have to learn all that stuff by rote.

Any suggestions for a (linux) compiler suite or whatever that is beginner-friendly, with reasonable defaults and no compiler headaches are welcome.

Test: If drawing a green line can be done through something resembling

main()
{
    open.window (640,480);
    draw.line (20,35, 20, 175, Green);
    keypressed;
    close.window;
}

or

main()
{
    open.window (640,480);
    for (x=35, x++, 175)
    {
        draw.pixel (20,x,Green);
    }
    keypressed;
    close.window;
}


and then just do something that somehow resembles what I want without me doing much more than click a button called "run program".

Please do note the lack of #include statements. Actually, it shouldn't even need the main() function unless it serves a useful purpose.

As far as I understand computers, this isn't magic.

I don't care if this means that I won't know if the background is black or white, exactly where on the screen the window opens, that can be fixed later. I just don't feel like first having to search half the internet for a graphics library that can do what I want, refer to a manual for setting up all the stuff that *has* to be set to a default value by explicitly saying so and have the compiler showing me the finger for forgetting to give the window I want to open a title bar, the color depth (which I would want to be the system default anyway) and a place to put its digital ass on my screen. Not to mention telling the compiler what to do if I click on that funny little cross in the upper right of the window. Unless, that is, I know exactly what I want it to do in this case.

As I said, sorry about the rant.

23:

Making up a town called "Dorking" and then vandalizing Wikipedia to provide an ersatz comic backstory for said pseudo-town is not funny. (To wit: "A game resembling rugby was once played here. The two sides were unlimited in number, representing the east and west of the town. ... There is now a big statue of the Dorking cockerel located on the Deepdene roundabout. ... Dorking is the administrative centre of Mole Valley District Council. ... Underneath the town lie the Dorking Caves which are open occasionally to the public.") You may think that you're being clever, but just because some of your commenters are willing to play along does not make it right. For shame.

24:

Helluva prank dating back to 2002 then.

25:

@24: The elaborateness of the gesture does not excuse it. In my view, a line has clearly been crossed.

26:

tp1024, that is actually the kind of thing Microsoft is fairly good at. (Heresy, I know.) Visual Basic, that bane of serious programmers everywhere, exists precisely because it makes it easy to do exactly the kinds of things you list above. The price is that you have to write in bastardized Basic, which eventually drives men mad.

Microsoft's C# is located somewhere in a Bermuda triangle between C, Java, and Visual Basic. It's still several times more verbose than you ask for, but it comes closer than most languages you'll see. It's available on Linux via the Mono port, because there are some fans in the Linux world and some people who are seeking some kind of unlikely detente between MS and Linux. I learned it in a day, so I think I can say it's easy to pick up for experienced C programmers.

27:

(At the risk of veering completely off-topic:)

@19: There was an academic study about the role of variable assignment, and if this helped students understand programming better. They asked questions in this format:

a = 10
b = 20
a = b

a is equal to what? 1) a = 10, 2) a = 20, 3) a = 30

This was done during the first lecture of a programming introduction course (aimed at those who had no programming background). If I recall correctly, the students who answered correctly got higher grades at the end. But I can't find the reference to it, right now.
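(For anyone who wants to settle it empirically: here's the quiz executed literally, as a minimal sketch that isn't from the study itself. The extra `b = 0` line is my own addition, to show that the two names aren't linked after the assignment.)

```python
# The quiz, line by line:
a = 10
b = 20
a = b          # copies b's *current value* into a; the names are not linked
b = 0          # changing b afterwards does not touch a

assert a == 20 # i.e. answer 2) in the quiz above
assert b == 0
```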

28:

@22: You might want to look into Processing - http://processing.org especially if you want to work with graphics.

29:

@27:

Not quite.

The ones who had the most consistent answers had the higher grades. That is because, if you answered the equivalent of 10 in all questions, you may have answered all questions wrong, but you probably had a fixed model of how this stuff works in your mind. Ok, it was a false model, but it is much easier to correct a false model than to make one from scratch.

30:

Hmm,

Mono:

To test the most basic functionality available, copy the following code into a file called hello.cs:

using System;

public class HelloWorld
{
    static public void Main ()
    {
        Console.WriteLine ("Hello Mono World");
    }
}
---

#exclude

I'm afraid I don't know how to comment on that one. But I will try it out. :)

31:

HTML ate my brackets ...

meant to write #exclude "sarcasm"

32:

tp1024 & others:

I came up through Fortran, Cobol, assembler, C-like languages, Modula-like languages (the last two proprietary to Burroughs - now Unisys), before finally getting to Smalltalk. Aside from bits of C (a nightmare language), VisualBasic, and Java, Smalltalk was my final and favorite language for many years (I'm now retired).

Smalltalk was one of the earliest object-oriented languages. It has a simple syntax that you can learn in an hour, and a class library that takes many years to fully appreciate. Smalltalk pioneered just-in-time compilation and advanced garbage collection, things that Java is still working on.

Anybody wishing to explore Smalltalk can download a free version from Cincom (http://www.cincomsmalltalk.com). I have no association with Cincom - I just like to share my favorite language. I wish it had caught on before Java (which I do not particularly like) came out and superseded it.

33:

...if they've kept on teaching set theory and logic, then it's likely that your typical arts graduate with a GCSE in maths in their background has got sufficient background...

Most of the kids I've been working with for GCSE Maths aren't likely to turn into arts graduates (typical or otherwise) but here's what I've seen: the top set got a little set theory, but those on the foundation course (top grade achievable: C) got nothing formal. I haven't seen anything Boolean, but then I haven't been there for a full year to see everything on the course yet.

34:

@5:

Well, I finished school in 2002, with an A level in Maths, an A level in Further Maths, and an AS level in Further Maths (Additional). (Plus the standard GCSE and a hybrid GCSE/AS thing in Additional Maths; we called it an AO but technically I don't think they've been called that since my parents' era.)

So far as I recall, at no point did we ever study anything resembling boolean algebra, formal logic, or any set theory beyond 'This is a Venn diagram. You can put red things in this hoop, and round things in this hoop, and look! We can overlap them!'

I did just about manage to get a CS degree despite considerable personal problems, but I would say that my previous education contributed to the sum of approximately zero. In fact, I sort of feel that in my first year of university I learned more than the previous 14 years of formal education combined...

35:

I think my previous post sounded more negative than I intended - I didn't mean that all that maths didn't have value in itself, simply that there was nothing geared towards preparation for anything computer sciencey.

36:

Well well well, Demon in '97-98. Not that I worked there personally, but my flatmate did, and when I last saw him he still did (they moved from Finchley to a swish Docklands office sometime in the 2000s). Anyway, he was working a bizarre shift pattern for 2nd-line support, and there was certainly no "corporate atmosphere" in that department, nor when he eventually got promoted and moved to a 9-5 job. I believe he was also one of Clive's hires.

I'm a maths grad who ought to have been a CS grad, and I wound up programming for a living anyway. Logic-wise I only got set theory, in my 1st year (now "Year 7"). Everything I initially learned about computers, I learned on my own CBM64 and Amiga.

37:

Charlie, I've been describing recipes as algorithms. So are knitting instructions: lots of 'do while' and 'repeat until' buried inside, and sometimes even in those words. (I think knitting has possibly the earliest typographical form of looping: 'K1, *(k1, p1) twice, yo, ssk*, k2' has two (nested) loops, just for a random example.)

38:

Charlie: Dorking - maybe, but a good place to visit.
At the time you are writing of, a town-centre pub, the "Old House at Home" had a sign on the door: "No Dogs" - you went in, and found two large spaniels, and thought ... "ok, it's to stop dogfights...." and THEN you saw the real reason..
A HUGE, dark-tabby tomcat (called "Nelson") who OWNED the place, and would stand no nonsense on his patch.
Sadly, those humans, and Nelson, moved out about 2002.
There was also a good pub right close to the West station.

@10 NOT "Lego", please!
Useless imitation ...
MECCANO for PROPER Engineering....

"pointers"
Maybe.
Are they really necessary?
I regarded them, during my brief acquaintance with C++, as a bloody nuisance, making life much more complicated than necessary.
However, I may add that I have never dealt with "graphics" (i.e. windows-type) programs.
All were maths/results/control loaded, so perhaps you need them for other applications - "horses for courses", I think.

39:

Weird to think you programmed WebShield. We used to use that where I work until I replaced it with the McAfee Secure Content Management appliances. They not only scan SMTP but also HTTP, FTP & POP3 for viruses and inappropriate content.

40:

Greg@38

"Pointers" not really necessary? Tell you what, try and create a list of large data objects. Now try and insert another object into the middle of the list. Or sort the list. Or anything else useful.

It's like saying "let's have a library, but without index cards". They're such useful things....

Anyway, back to the cargo cults. Way back in the 1980s when I did my CS degree (with all of that messy programming in pre-standard ML, natch) the department ran both "CS1" (CS101 for y'all, and a prerequisite for the CS degree) and "IS1" (Information Systems 1), which was rather cruelly described by one of the CS4 lecturers as "Sunday Supplement Computing".

41:

"(being paid by the hour — they don't notice all the hours you rack up in between jobs, chewing your fingernails and looking for work resting)"

... Keeping up to date with developments that you aren't paid to do by the job, invoicing, chasing invoices, preparing accounts, buying all your own equipment including office space, paying for your own holidays and sick time, paying your own pension, fixing your computer when it breaks (or not working while someone else fixes it, or forking out for redundant hardware - and keeping it current) ... etc

That said, I only ever had a full time employed job for 3 months and hated it. I do get tired of working alone though, would like to work in a small collective.

42:

Re: landscaping Dorking... hasn't it been done by the Germans in a somewhat alternate timeline (sadly before nuclear weapons)? :-)

http://en.wikipedia.org/wiki/The_Battle_of_Dorking

43:

There's actually a late Victorian bunker still on the top of Box Hill (overlooking Dorking) - I can't help but wonder if it was a direct result of that story.

44:

tp1024 @24

I feel your pain. Unfortunately, interfaces are like everything else - if it's wonderfully simple, it isn't able to do everything that people want. If it's big enough for everything that people need, it's complex enough to require that you use a reference manual. The best you can hope for is a compromise.

draw.line (20,35, 20, 175, Green);

How do I draw a dotted line? What's the correct way to draw a triangle - three points, or three lines? How do I draw a circle? Is ellipse different from circle, or is circle a special case of ellipse? How do I draw a pentagon, or a hexagon - are these basic types?

An excellent writer, whom I would recommend to any serious software engineer, is Les Hatton (who wrote a rather good book called "Safer C"). One of his heresies is that object-oriented programming hasn't actually improved programmer output or decreased bug densities; unfortunately for any Inquisitors, he appears to have the figures to back the heresy up...
http://www.leshatton.org/IEEE_Soft_98a.html

45:

I'm enjoying this braid of threads very much. I'd almost given up on explaining the meta-systemic lunacy of the computer/software group of industries to people from more traditional careers. Whenever I tried, they'd give me that "you're insane" look. Whether or not I am, the possibility never seems to occur to the audience that the whole cosmos that I was snapshotting, from Bill Gates to Steve Jobs and Lawrence Joseph "Larry" Ellison on down is run by brilliant but dishonest lunatics according to a set of axioms that started in the Twilight Zone and were mutated by red kryptonite aboard the Red Dwarf down the Rabbit Hole. You succeed where I failed. Bravo!

47:

Regarding maths vs. CS:

Learning to specify software in Z (all symbolic logic and set theory) changed the way I thought about programming. The fact that I never used it in a 'real' project is irrelevant.

It meant I could learn OO and UML and design patterns and the like despite my advanced years...

But, like Charlie, I'd rather write novels than code.

49:

I can definitely identify with that "writer's learning process" - the slow figuring out of how things work.

A lot of this should be standard given the size of the "how-to" industry, but somehow they never get past telling you to write and mail the query letter.

Also really interesting hearing the computer world stories, having come of age in the tech boom when this was made out to be not just the safest, most practical career track, but a sure path to fame and riches - or at least, the only job left with everything else rendered obsolete. (Not that that's all gone, of course. Sitting here with the TV on I'm being barraged by local commercials from technical schools promising wonderful high-tech careers to anyone who pays their tuition.)

50:

Pardon the interruption, but I can't resist pointing out that, according to Amazon, my copy of Wireless is on its way. Hooray!

51:

Problem solving. Programming is all about problem solving. I think (feel free to disagree) being *delighted* every time you solve a problem is going to help you be a programmer more than anything else. Programming == Sequential Problem Solving.

Works for me anyway -- been programming since I was 12 (crap! 29 years).

Having said that, some people simply do not grok programming in any shape, size or form. Their brains are wired differently and they end up working in the Marketing Dept...

52:

Oh, Ghu, pointers.

One of my CS professors handed us the algorithm for a non-recursive Quicksort, which we had to then write the code for. In Fortran. Lots of stacks implemented as arrays, with indexing pointers. I seem to recall that that quarter also included Basic, assembler (Macro-11), and Latin, and I was learning Pascal on the side. (I dropped out about four classes short of the degree. Compiler design (requiring writing one) and differential equations were three of the classes ....)

53:

Re: the perils of consulting -- back in the 90s -- when common knowledge said that Japan was going to conquer the computing universe -- I knew a consultant who made a fantastic living getting fired from jobs. He discovered his unique consulting niche by being flown over to Japan to "fix" a failing project (at some ungodly high hourly rate plus a big fat per diem to live on while in Tokyo). It slowly dawned on him that he wasn't supposed to fix the failing project, but to be the token gaijin who would take the blame for a project's failure. They paid him handsomely for his "failure" -- and by taking the blame, no Japanese worker or executive had to take the fall for a poorly planned and poorly executed project. His boss recommended him to other execs who had failing projects on their hands. At the time I knew him, he was flying regularly to Japan, always to drop in at the tail-end of a project -- to take the blame for poorly thought out IT initiatives. In the US he was brilliant and successful, but the Japanese paid him much more to be brilliant and a failure. He spent most of his abundant free time in Japan teaching himself Japanese -- which became his true love. There must be a story in that!

54:

Clive Feather moved to Demon from IXI in Cambridge, where I worked as a tech author at the time you were at SCO. Though I think Clive was more often found in Watford by the time he finally left the company.

In fact you came up to Cambridge for a day in (I think) 1994 to teach me SCO's mad troff macros. I don't remember why; I never needed to write anything using them: we had our own systems for IXI's software.

55:

David: yeah, I remember -- IXI was using FrameMaker, which was infinitely more sane than what SCO was using. (But then, SCO had a gigantic quantity of legacy nroff formatted documentation by way of the UNIX lineage.) Clive was, IIRC, working in Watford from 1993 onwards, if not earlier (before SCO's takeover of IXI), although I'm not clear on the details.