March 2013 Archives

While Charlie's away, Joan and Stina appreciate the chance to fill in. Picking up on Elizabeth's thread:

So where "on the map" are women in science and science fiction?

A startling study in PNAS tested how male and female lab directors evaluate candidates for a lab manager position. Both male and female scientists were more likely to hire a male candidate than a female one. "Analyses indicated that the female student was less likely to be hired because she was viewed as less competent"--even though the study design presented candidates with identical qualifications. Similar studies show that both male and female reviewers are more likely to review a paper favorably when it carries a male author's name. And a recent issue of Nature reviews the whole disturbing picture.

The dearth of women in engineering and Silicon Valley is no surprise. And in related technical fields, the climate remains harsh. Stina recalls her time at Motorola as a graphic designer: "My first experience with gender bias was while working at Motorola. They paid me $48k per year and made out like they were verging on overpaying. (This, with six or seven years' experience working for Motorola as an illustrator/designer.) Mind you, I'd designed the covers for their award-winning PowerPC user manuals. Eventually, I got a $2,000 merit raise. They then promptly hired a male graphic designer fresh out of college for $65k. I was not happy when I found out. I'll add that at that time there was an older male manager known for sexually harassing the female staff--an incident in a cab was particularly frightening. There were enough complaints documented that rumors of a lawsuit surfaced. That was when personnel took notice, and I was interviewed. (I'd had my own incident with the manager in question.) In the end, personnel chose to protect the manager. He got a transfer to another department."

But the findings reported in PNAS and Nature are more surprising because they focus on the biological sciences, where women arguably seem to have made the greatest strides. In undergraduate and graduate biology programs, women now often outnumber men. A student of Joan's recently interviewed at a top grad school where he was the only male candidate among a dozen women. But who are their professors? There, it's a different story.

Remember that it's been barely a generation since women in most Western cultures, educated or not, were expected to stay home. Those of us who first pursued careers had to fill the roles that male communities assigned. Thirty years ago, Joan recalls sitting in the office of a female mentor who had done her graduate work with James Watson, and who lacked confidence in young Joan. When Joan burst into tears, her mentor paused and reflected, "That happened to me in Watson's office." The mentor--like her own former mentor, Watson--had been trained to devalue her female students.

It's tempting to think, "We've Outgrown All That." There are indeed many mentors, including men, who genuinely support women students and writers. And there always have been--John Bernal, for instance, a contemporary of Watson's who effectively mentored several female crystallographers, including Nobel laureate Dorothy Hodgkin. Today, perhaps it's too easy to say "We support women," or "We interview women" (though they don't quite get the job). The PNAS and Nature stats cannot be written off. Is it possible we've taken our modernity for granted? Are our modern biases so skillfully hidden from ourselves that we don't even see them?

Try this test and find out. Do you personally associate science with men more than with women? What about race, ethnicity, orientation? Most of us who take this test find that our minds still deeply associate male/science and female/family. Is it reasonable to think such associations don't affect our professional judgments?


The gender bias exists in fiction, too. It's well known that reviews are what give authors a leg up in the literary world--yet male authors are far more likely than female authors to get that coverage. According to data collected by VIDA (an organization for Women in the Literary Arts): "The New York Review of Books (89 reviews of female authors in 2012 to 316 of male authors), the London Review of Books (74 female authors to 203 male) and the Times Literary Supplement (314 female authors to 924 male authors) fared especially ill."

If you think that SF and F are free of this bias, think again. The cover of our own SFWA Bulletin (the Spring 2013 issue that just arrived) lists only articles by male authors. Meanwhile, inside the magazine there are no fewer than forty-three cover images of novels by male authors, as opposed to five by female authors. (One of those five was cropped in such a way that the author's name was removed from the image.) In the Fall 2012 issue, the numbers were male: twenty, female: twelve--and that was an issue that highlighted female SFF authors. Reviewers of the women-centered anthology The Other Half of the Sky praise the "female protagonists ... just as incredible and compelling as their male counterparts." How long will we have to go on proving this point?

These stats are of particular interest for those of us in research science, where objectivity is the coin of our realm; and in science fiction, which claims to reach beyond “mundane” assumptions. Overcoming gender bias, and assumptions in other dimensions, can only lead to more creative science and fiction.

So what can we do about it?

(Note for newcomers: Stina Leicht is the author of two novels published by Night Shade Books, Of Blood and Honey and And Blue Skies from Pain, and has a new short story appearing in the anthology Rayguns Over Texas this fall. Joan Slonczewski authors the Frontera series and conducts microbiology research funded by the US National Science Foundation.)

You might have noticed I'm not blogging right now, but Joan Slonczewski and Elizabeth Bear are.

That's because I'm in Perth, Western Australia, at an SF convention. And after it's over, I'm having a vacation — one of those weird hermetic retreats normal people go on, during which they refrain from working. I'll be back in late April. In the meantime, it's guest blogger time.

(As it happens, I do have an announcement to make next week, once I'm out from under an NDA. But apart from that, consider me AFK — Away From Keyboard.)

You probably think you know what a nuclear explosion sounds like.

You're probably wrong.

The first footage released of hydrogen bomb tests was silent. A foley track was dubbed in, using a standard explosion or cannon sound effect repeated to form the familiar continuous, ominous rumble. (If you think about it, this is pretty obvious: in the footage most of us are used to, the audio and visuals are simultaneous--yet these films were shot from miles away from the blast site, and sound needs roughly five seconds to cover each mile.)

Here's what it really sounds like.

(Blast begins at 2:20, sound arrives at 2:54)

...a gigantic slam. And then a rumble, but a lighter one than we're used to hearing.
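(A back-of-the-envelope check, assuming sound travels at roughly 340 metres per second at sea level: that 34-second gap between flash and bang puts the camera about 11.5 kilometres--a bit over seven miles--from the detonation, which is why the simultaneous audio in the dubbed footage is such a giveaway.)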

Now, I don't think this was done to fool anyone. I think it was done because, as filmmakers from Roddenberry to Lucas have discovered, a silent explosion lacks a certain visceral punch for most people.

To fix that, they used what they had to hand.

The point here, inasmuch as I have one, is that the media we consume produces our map of the world. We process our understanding of reality through those filters: the human brain deals with a world of unrelenting complexity by finding patterns and filtering out input deemed irrelevant. Our bodies are optimized for this process, in fact. As opportunistic omnivores, we readily taste salt, sugar, protein, acid, possibly fat--and certain classes of toxins!--while cats and chickens cannot taste sugar at all. (Some cats may have a limited ability to do so.) Cats, however, appear to be able to taste adenosine triphosphate: they're obligate carnivores, and that is the taste of meat.

Dogs are better at tasting and digesting starches than their wild wolf ancestors: they have adapted to a life on humanity's midden heaps. Bees sense magnetic fields and the ultraviolet colors on a flower petal that seems plain white or blue to human eyes.

Alien perceptions, in other words, necessarily produce an alien map of the world. And manipulated perceptions produce a manipulated map of the world.

And to further complicate the matter, one's acculturation strongly affects how one processes--filters--information, and what patterns one finds there.

All our maps are of necessity flawed. We can't see through our friends, as a dolphin--a living echogram machine--can. We can't smell incipient cancer or a seizure about to happen, but our family dog can. And we can't correct for every bit of spin--intentional, careless, or just necessitated by human limitations--that creeps into our information flow.

We can, however, be aware of a truism coined by a subject of one of the world's great police states, where spin and message control were a way of life--and, for many, of death: that there are more things in Heaven and Earth than are dreamt of in our philosophies.

So the Salt Being is back. Bwa-ha-ha! More microbiology.

As Charlie points out, there are lots of ifs and buts about the coming singularity, the day when machine intelligence finally overtakes the human mind. But what if the singularity is already underway? And if it is--what does it look like?

Suppose it looks like mitochondria. Suppose we're becoming the mitochondria of our machines.

How did mitochondria get to be what they are today? The (now classic) theory of endosymbiosis began as a New Age feminist plot by Lynn Margulis, a microscopist known for setting paramecium videos to rock music. Around one or two billion years ago, a bacterium much like Escherichia coli took up residence within a larger host microbe. Either the larger tried to eat the smaller (as amebas do), or the smaller tried to parasitize the larger (as tuberculosis bacteria do). One way or another, their microbial descendants reached a balance, in which the smaller bacterium was giving something useful to the host, and vice versa. In fact, this sort of thing happens all the time today. If you coculture E. coli with amebas, an occasional ameba will evolve with bacteria perpetually inside--and the evolved bacteria can no longer grow outside. They are slipping down the evolutionary slide through endosymbiosis, to eventually become an organelle.

But the price of endosymbiosis is evolutionary degeneration. Genetically, the mitochondrion has lost all but a handful of its 4,000-odd bacterial genes, down to 37 in humans. Most of these genes conduct respiration (obtaining energy to make ATP). From the standpoint of existence as an organism, that seems pathetic. The mitochondrion is a ghost of its former identity.

But is it so simple? Did mitochondria really stay around just for that one function? If that’s all the genes that are left, then how do mitochondria contribute to tissue-specific processes such as apoptosis (programmed cell death), production of oxygen radicals, and even making hormones?

Surprise--about 1,500 of those former mito genes are alive and well in the nuclear chromosomes. How did the genes get there? First, mitochondrial DNA replication is error-prone; errors accumulate there much faster than in the nuclear DNA. Second, DNA replication often duplicates genes--the leading way to evolve new functions. Suppose a duplicated gene ends up in the nucleus. It will stay there, while the mitochondrial original decays by mutation. Thus, over many generations, the mitochondria outsource their genes to the nucleus.
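For the computationally inclined, here's how that one-way ratchet looks as a toy simulation--a deliberately crude sketch with made-up rates and a scaled-down genome, not a real population-genetics model. The handful of "retained" genes stands in for the real-world constraint that some gene products (such as core respiratory membrane proteins) apparently cannot be imported back into the organelle:

```python
import random

# Toy model of mitochondrial gene outsourcing. All rates and sizes are
# made up for illustration; this is a cartoon, not population genetics.
random.seed(42)

N_GENES = 200        # scaled-down stand-in for the ~4,000 ancestral genes
N_RETAINED = 5       # stand-in for genes that cannot move (assume their
                     # protein products can't be imported into the organelle)
P_TRANSFER = 2e-4    # per gene per generation: a duplicate lands in the nucleus
P_MITO_DECAY = 1e-3  # error-prone mito replication: a redundant copy is lost

mito = [True] * N_GENES    # functional copy in the mitochondrion?
nuc = [False] * N_GENES    # functional copy in the nucleus?

for _ in range(50_000):
    for g in range(N_GENES):
        if g >= N_RETAINED and mito[g] and not nuc[g]:
            # Duplication may drop a working copy into the nucleus.
            if random.random() < P_TRANSFER:
                nuc[g] = True
        elif mito[g] and nuc[g]:
            # With a nuclear backup, the mito original decays by mutation.
            # Without one, losing the gene is lethal, so it persists.
            if random.random() < P_MITO_DECAY:
                mito[g] = False

print(sum(mito), "genes still in the mitochondrion")  # about N_RETAINED
print(sum(nuc), "genes outsourced to the nucleus")
```

Run it, and essentially every transferable gene ends up in the nucleus while the retained handful persists--the same lopsided split the real mitochondrion shows, with its 37 resident genes and roughly 1,500 relocated ones.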

Is this starting to sound familiar? As Adam Gopnik writes, "We have been outsourcing our intelligence, and our humanity, to machines for centuries." Ever since Adam and Eve put on clothes (arguably the first technology), we have manipulated parts of our environment to do things our bodies no longer have to do (like grow thick fur). We invented writing, printing, and computers to store our memories. Most of us can no longer recall a seven-digit number long enough to punch it into a phone. Now we invent computers to beat us at chess and Jeopardy, and baby-seal robots to treat hospital patients.

As we invent each new computer task, we define it away as not "really" human. Memory used to be the mark of intelligence--before computers were invented; now it's just mechanical. But as Foer notes in Moonwalking with Einstein, memory is closely tied to imagination. Once we can no longer remember, how shall we imagine? And if all our empathy is outsourced to dementia-care robots that look and sound like baby seals, what will be left for us to feel? Poetry and music? Don't even mention them--computers already compose works that you can't distinguish from a human's.

Yet we humans still turn the machines on and off (well... sometimes). The machines aren't actually replacing us, so much as extending us. That's the world of my Frontera series. Humans still program the robots and shape the 4D virtual/real worlds we inhabit. But those worlds now shape us in turn. Small children exhibit new reflexes--instead of hugging their toys, they poke and expect a response.

The real question is, what will be the essential human thing left that we contribute to the machines we inhabit? Will we look like the "brainship" of Anne McCaffrey's The Ship Who Sang--or more like the energy source of The Matrix? Mitochondria-hosting cells ushered in an extraordinary future of multicellular life forms, never possible before. Human-hosting machines may create an even more amazing future world. But if so, what essential contribution will remain human?

(I just felt the need to lift a comment I posted on an earlier thread up here where it belongs.)

Quoth a commenter, to whom I felt the need to reply:

Things change. Technology accelerates it. The only thing up for debate is the timing.

This is a statement of ideology, not of fact.

For most of the duration of the human species, change has not been an overriding influence on our lives. In fact, it's only since roughly 1800 that you couldn't live your entire life using only knowledge and practices known to your mother and father. We are undeniably living through the era of the Great Acceleration; but it's probably[*] a sigmoid curve, and we may already be past the steepest part of it.
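To make "past the steepest part" concrete, here's a minimal sketch, assuming the textbook logistic form (my simplification for illustration, not a fitted model):

    f(t) = L / (1 + e^(-k(t - t0)))

where L is the ceiling, k the growth rate, and t0 the midpoint. The slope of f peaks exactly at t = t0, at a value of kL/4; being past that inflection point means growth continues but decelerates--which is exactly the claim in question.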

In the previous thread, one of the commenters noted: Of course, OGH has previously estimated that disaggregating the publisher's job would land him with 0.5-equivalent of management work, leaving us (and him) with only 0.5-equivalent of an author. That doesn't mean it can't be done, but there'd have to be a pretty good reason...

Let me expand on that, in case anyone isn't convinced.

(Caution: here lies crazy speculation. For a backgrounder, the casual reader should probably read my Common Misconceptions about Publishing series of essays; otherwise you're going to fundamentally misapprehend what I'm talking about.)

Trade fiction publishing is a supply chain business. At the back end, out of sight, a trade fic publisher takes raw inputs from a large number of small businesses (mostly sole traders). It transforms these inputs, packages them, and then—at the other end of the business—distributes them via wholesale and retail channels. You or I then buy the products, which are micro-branded and highly idiosyncratic. The author is the micro-brand; despite centuries of striving there are few sub-sectors of trade fic publishing where a reader might go to a store and buy half a kilo of a particular publisher's product range without reference to the authorial brand.

Like all supply chain businesses, trade fiction publishing is dominated by contracts—contracts with suppliers (such as authors, copy editors, typesetting bureaux, print shops, cover artists), and contracts with customers. (You and I are not these customers: I'm talking about Amazon, Barnes and Noble, Waterstones, et al.)

These contracts lock in certain types of business practice. And the first contracts in the chain are author/publisher contracts. And so it occurs to me to ask: what new business models might be possible if author/publisher contracts were drafted differently?

We get book covers. We even get new book covers! Lots of new book covers! Which gives us a chance both to show off new books and to examine just how different publishers in different markets approach the art of cover design.

First up is the UK paperback release of The Rapture of the Nerds, by myself and Cory Doctorow. It's lovingly produced by Titan Books, who have gone and excelled themselves with this great cover:

[Cover: The Rapture of the Nerds (UK paperback)]

Second up, and exemplifying the way marketing strategies differ between the USA and the UK, are two covers for the same book — Neptune's Brood (coming on July 2nd, 2013; pre-order by following that link). Compare and contrast Orbit's kick-ass watery space opera (left) with the alternative design values implicit in Ace's US cover (right):

[Covers: Neptune's Brood, UK edition (Orbit, left) and US edition (Ace, right)]

Third and finally, Orbit have redesigned and are reissuing the earlier Laundry novels, with an all-new cover design to give the series its own distinctive visual identity:

[Covers: The Atrocity Archives, The Jennifer Morgue, The Fuller Memorandum, and The Apocalypse Codex (Orbit reissues)]

What do you think?

I am not amused. Yet another example of the impermanence of cloud services (especially services that you don't, ahem, pay money to receive).

What are the likely consequences if, after the next election (in 2015), Britain votes in a referendum to leave the EU? (As 53% of UK voters apparently desire ...)

I will be in Peters Brauhaus tonight, from 8pm; all welcome. (Cologne Pub Guide entry.)

I get to give talks. Here's one I gave to a seminar of engineering students at Olin College (south of Boston) last month, on the next thirty years:

On Tuesday I am heading out the door to Dortmund, for DORT.con. Between now and then I am embarked on a death march to the end of the edits on "The Rhesus Chart". If your guess is that this means I'm going to be scarce around here for a week or so, your guess would be right.

In other news, we're closing in on the deadline for Hugo award nominations—it closes on March 10th. If you're eligible to nominate works for the awards, I urge you to do so; all too often the Hugo shortlist hinges on a lamentably small number of ballots. (In case you were wondering, I have three eligible works this year: the short story A Tall Tail (on Tor.com), The Rapture of the Nerds (novel, Tor, co-written with Cory Doctorow—free download via that link), and The Apocalypse Codex (novel, published by Ace)).

Finally, back in the real world, it looks like the second Dragon ISS resupply mission has docked with the space station, and Dennis Tito's team has devised an innovative solution to the cosmic radiation shielding problem...


About this Archive

This page is an archive of entries from March 2013 listed from newest to oldest.

February 2013 is the previous archive.

April 2013 is the next archive.
