(1) A lucrative use, if the "printers" can be made portable enough, will be floor- and wall-coverings. Consider the front parlour with a customized-to-its-actual-dimensions-and-shape "Persian"/"Turkish" wall-to-wall carpet. More to the point, consider this sort of thing in semi-industrial areas, like management's offices in an automobile factory. Too, we might see a return to vertical fabrics over wallpaper — again, applied on site and fitting perfectly, avoiding nasty seams. Fabrics, and particularly felt-type fabrics, do a better job of sound- and heat insulation than thin-sheet cellulose (and cellulose analogs), so that's a bonus.
The biggest problem here will be adapting the process to the substrate. These uses have one huge advantage over clothing, from the perspective of the manufacturer: Planar tensile strength is less important than in many other textiles. That's precisely why car manufacturers can use this in a car's footwell!
And as a side note, consider this for making custom furniture covers, and for tablecloths and mats in mid-range restaurants.
(2) Another type of clothing that would lend itself to this use is the company/school tie/scarf/cravat. Indeed, because their form is among the simplest (fundamentally, they're just folded sheets), they'll probably be one of the earliest "on site" uses — that is, a large company (or, as part of the building's rent, several small companies in the same location sharing a single "unit") or school will print five ties/scarves/cravats for new employees/students at orientation, actually length-sized to the employee/student.
If this sounds like some of the worst aspects of stereotypical misinformation concerning salarymen in Japan, it should...
(3) In intellectual property terms, there's a "third way" that presents some interesting problems: The design patent. Design patents are restricted to decorative, nonfunctional aspects of physical objects. They have very limited terms... but are of use in helping establish a particular design feature as a "famous mark" so that trademark law can take over when the design patent term ends. Consider a certain shoe manufacturer's three parallel oblique stripes, which began as a way to decorate a necessary seam location (on the types of shoes and with the materials used at that time) and have evolved into a trademark. In the US, those three stripes were covered by a design patent (under an older statute) when that manufacturer moved into the US market, and evolved into a general brand identification. I understand that there was a similar process in Europe for that manufacturer/brand, but this all precedes 1992 so the legal issues are... less uniform.
The key with this kind of transition is knowing in advance what aspect(s) of the design will become famous. Is it the doubled handle on a handbag? The stitching on the handle? The stitching pattern connecting the handle to the handbag? Some combination? No matter what any design guru says, the particular aspect that becomes famous can't be predicted in advance; if you need proof of that, look at a Cadbury's bar and ponder the recent rejection of its attempt to register a specific color of purple as its mark...
At the battalion level — and above — measured by both (a) achieved mission objective, and (b) didn't give up that mission objective in the inevitable counterattack, my general citation is to the official Army Air Corps histories, particularly around St. Mihiel. Unfortunately, my copies are packed away at the moment so I can't be more detailed than that... except to note that the German histories I've read (again, packed away and/or still in the Pentagon libraries because I never had personal copies) made the same point. That's not to say that there were no awful American units; it's only to say that at the battalion ("operational") level and above, the incidence of "bad" battalions by 1918 was no greater than that of any other combatant — and that's primarily because of logistical concerns.
It's somewhat ironic that American inexperience with trench warfare meant that the US had not reset its entire system around three years of static trench warfare at a time when even infantry warfare was evolving out of the trenches... and that American doctrine therefore still required mobile logistical support.
And this is definitely a tangent, except to note that once again the logistical tail is wagging the teeth of the dog, which was my main point.
Individual US soldiers made lots of mistakes in 1918. Small units made lots of mistakes in 1918. Even relatively large units made lots of mistakes in 1918. And unit-for-unit, they were still more effective than anyone else... because they hadn't been bled by four years of trench warfare and therefore were not making the mistake of neglecting logistics and rear-area functions and maintenance of operational/strategic reserves in the name of getting more fodder into the front lines.
Further, the writing was more on the wall in terms of logistics than of troop strength. The US hadn't had the "opportunity" to destroy a generation of soldiers, materiel, morale, etc. in Belgium and France and Silesia. Meanwhile, a very high proportion of the durables that were the foundation of economies throughout Europe were wearing out due to nonreplacement since early 1915. For example, the official US history remarked on the problems with moving supplies from French ports to the front caused not just by the different rail gauge (meaning everything had to be reloaded), but by the broken-down and missing rolling stock and locomotives. The implications of that for getting food to the general population should be pretty obvious. Factory machinery was even worse off. Then there was the access-to-minerals problem: all of the minerals in common industrial use in 1916-20 were available in sufficient quantity within the US, whereas no European power could obtain its industrial needs entirely from within its own 1914 borders.
Western Europe in 1918 was rather like tag-team professional wrestling (which, given the preordained "side plots", etc., is disturbingly close to what appears to have been going on): The UK, and to a lesser extent France, tagged their previously uninvolved partner the US to put a flying suplex onto the Hun... and the match ended with a fake submission hold, to be reignited a couple of decades later with lots of posturing between has-been leaders.
Autocorrect, not so much... especially since no typist outside of Word is taught to use (r) for the registered-mark symbol.
That's certainly the only explanation I can come up with for some of the ... inventive ... autocorrects built into the system by default. For example, almost nobody has ever used (r) as an abbreviation for "registered trademark," and yet a lot of people have needed the p-in-a-circle for pre-1972 recorded music. Therefore, I infer that there is an eldritch distinction between the two that only Bob can trace down; whether it's necessary to autocorrect the one without the other to avoid or to force the correct representation of a name of the Old Ones...
(1) Not all italics are italics. Really. For example, the italics ordinarily used for emphasis in running text do not have the same meaning as the italics ordinarily used for a foreign word/phrase in that same document, let alone certain title-of-the-reference levels in the footnotes and/or bibliography and/or running text (which may change their nature depending upon whether one is writing a legal document as a brief intended for a court or an academic document for other lawyers or an academic document for nonlawyers).
The problem here is not with the file format. It is with the user interface. There is no reason — none — that <ctrl>i could not create a style rather than a hardcode by default, with <ctrl><shift>i for the hardcode (required to, say, accurately quote an existing document that used italics). The same goes for drop-down/context menus accessed via mouse or other nonkeyboard pointer. However, no word processor does that.
(2) The arrogance of the Chicago Manual of Style (note, carefully, the hardcoded italics there) and its attempt to impose what one group of typesetters thought would make their jobs easier upon everyone else in the US — an attempt which, since the first edition, has become a true monster — seriously infects everything (and disrespects non-US users). One of the reasons that Word does not nest footnotes is that CMS, since the second edition, has said that's not allowed. Of course, an awful lot of us who deal with footnotes don't use CMS...
The flip side of this is that other citation-and-format systems are equally arrogant, if not more so. The Blue Book (actually, that should be in small caps, not italics... but that's a CSS-level code that doesn't work in MT interfaces, because HTML — in its own bit of arrogant disregard for others — doesn't think small caps are important enough to be a default tag) makes whether there is a space after a period a matter of substance. There's a difference between "F. Supp." (correct), "F. Supp. 2d" (correct), and "F.R.D." (also correct), all three of which refer to decisions from US district courts. Actually, that should be "U.S. District Courts".
It's all well and good to say "well, just modernize all of the systems." The problem is that modernizing for the future doesn't change legacy documents. As a specific example, the Blue Book "modernized" its rules for whether one may/must close up periods in multipart abbreviations when it dropped the old rule, which was based on how many syllables were in the underlying term being abbreviated. (Yes, really — it used to be "F. R. D.", which could not be broken across lines.) The less said about all of the unnecessary periods, which lead to inconsistent word counts, the better... not to mention the continued use of abbreviations for states used nowhere except legal citations since the 1960s, and which cause nothing but confusion...
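The modern close-up rule behind those reporter examples — adjacent single-capital abbreviations close up, while longer abbreviations and ordinals are set off by spaces — is simple enough to state in code. A sketch against the three examples given earlier, not anything close to a full Bluebook implementation:

```python
def join_reporter(parts: list[str]) -> str:
    """Join reporter-abbreviation parts per the modern close-up rule:
    two adjacent single-capital abbreviations (like 'F.' and 'R.') close up
    with no space; anything longer ('Supp.', '2d') is set off by a space."""
    def is_single_capital(p: str) -> bool:
        # e.g. 'F.' -> True; 'Supp.' and '2d' -> False
        return len(p) == 2 and p[0].isupper() and p[1] == "."

    out = parts[0]
    for prev, cur in zip(parts, parts[1:]):
        out += ("" if is_single_capital(prev) and is_single_capital(cur) else " ") + cur
    return out
```

Run against the district-court reporters: `join_reporter(["F.", "Supp."])` yields "F. Supp.", `join_reporter(["F.", "Supp.", "2d"])` yields "F. Supp. 2d", and `join_reporter(["F.", "R.", "D."])` yields "F.R.D." — which is exactly why the spacing is a matter of substance, not taste.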
So, much as I despise the way Word (not to mention most other word processors — WordPerfect was better than most, but far from perfect) has implemented some of these systems, any implementation was going to resemble verb forms in German (lots of irregular verbs, including some of the most common). It's too bad that Word focused its user-interface efforts on assuming user stupidity and hung its file paradigms off the user interface, rather than the other way around... but that would just present us with a different set of kludges to worry about.
(3) Non-US-typewriter characters. Proclaiming that the eszett (ß) can be represented by ss (which might get split by a hyphenation algorithm) does not make it so, particularly not when trying to accurately quote another document. And sometimes it's wrong. However, if I try to code that character into my default keyboard, I have to give up access to another command... and can't access it consistently across programs or documents anyway. And that's just one character; I'll save my rant on diacriticals (and their effect on spellcheckers) for another time...
(4) A pox on all typography gurus who aren't active writers in the fields for which they proclaim "correct" or "best" practices. I don't have the option of varying line spacing for better readability in documents submitted to California courts (25 evenly-spaced, numbered-in-the-margin lines per page), so the received-wisdom ideal typeface that accentuates superscript placement and minimizes subscript placement won't work for me. Neither does received-wisdom advice on changing line length to match some purported ideal (which is inaccurate for those of us who wear glasses, whether reading on paper or on screen, in any event). Conversely, if I'm quoting that same document in a different one (a court document, an article, a letter to a legislator, whatever), I have to strip all of the court-mandated formatting... and received-wisdom advice is then irrelevant.
The world's default driving methods will become those of San Jose, California... because that's where the closest-to-actually-commercially-ready systems are being developed. That means:
* complete inability to handle roundabouts/traffic circles without pausing at the entrance
* no limit to the number of lanes one's vehicle crosses on the way to an exit that one didn't plan on taking
* the less said about merging into traffic, the better
* things are even more interesting on the left side of the road; it's actually a lot harder to "reprogram" a perception system trained on right-hand traffic than it is for a human driver to adapt (it's about sightlines, too)
On the other hand, that sure beats Riyadh or Madrid or Tokyo. (Or Manchester, but that's another story entirely.)
Sadly, the whole "just war" issue (de Groot aka Grotius is just the tip of the iceberg) is a classic example of post hoc rationalization, in which both history and legal doctrine get written by the winners. The history behind what exactly is included in the Hague Convention and the Second Geneva Convention is rather horrifying. Now stir in the identity of accessions to those treaties — and, disturbingly, imputed accessions — and things get really... interesting. ("Interesting" in the sense of "I'm stuck in this graduate seminar, and a fellow student has just demonstrated a disturbing command of clever rhetoric to make a point not worth making but that might impress either the professor or that attractive fellow student over in the corner, and I'm too polite to call it 'utter bullsh*t'.")
Not to be overly pedantic, but there's a big difference between "squaddie knows how to do" and "command structure provides authority to do." Until you've been on PRP (the Personnel Reliability Program) yourself, and/or had to certify people as PRP-ready, you've got no idea just how the "authority to use" issues change the way that training has to be done for seemingly the simplest of tasks. When "ensure that steps 17 through 34 on page 3 of the manual have been followed to arm the chemical-munitions payload" has to be added to the 155mm NATO-standard-howitzer training, things get... interesting. Especially with squaddies who aren't used to reading long checklists as part of "routine" reloading!
@173:
That might make resupply of the bunkers, and maintenance of whatever weapons systems are in the bunkers (it ain't just M16s), easier... especially in foul weather. This was a standard requirement long before the ADA made it mandatory; the ADA and implementing regulations are just a bit more particular about things like the permissible slope, etc.
The insane cannot be deterred. They are, precisely because they are not rational, impossible to model in game theory. In short, dealing with a madman bears little or no relationship to one's own arsenal; it bears little or no relationship to one's own willingness to use what is in one's own arsenal. And that goes for a wide variety of "insanity."
The implication of this understanding is that holding a nuclear arsenal will not deter a madman, whether that's a recognized head of state like the current NK leader or a non-state actor like bin Laden or a theocrat like whoever is actually running things in Iran (and it's frightening how uncertain that is... I used to think I knew, and it was my job to know).
Nuclear deterrence is a game-theoretical construct that only works when all players in the game are rational. And that's true for any n>1 in the n-player game; even one irrational player, umm, blows it up.
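A toy model makes the point (illustrative only, with invented payoff numbers): deterrence works because each player best-responds to the payoff table, and a single player who simply does not consult that table voids the equilibrium no matter how large anyone's arsenal is.

```python
def expected_payoff(move: str) -> int:
    # Against an assured-retaliation posture: holding preserves the status
    # quo (0); striking triggers retaliation and mutual destruction (-100).
    return 0 if move == "hold" else -100

def rational_actor() -> str:
    # maximizes expected payoff, so never strikes -- this IS deterrence
    return max(("hold", "strike"), key=expected_payoff)

def irrational_actor() -> str:
    # the payoff table is never consulted; arsenal size is irrelevant to this
    return "strike"

def outcome(moves: list[str]) -> int:
    # mutual destruction for everyone if any single player strikes
    return -100 if "strike" in moves else 0
```

With five rational players, `outcome` is 0 forever; replace any one of them with `irrational_actor` and everyone gets -100 — the n>1 point from above, in four functions.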
Whether some of our recent-past Anglo-American leaders qualify as entirely rational, at least for this purpose, is a disturbing question. The less said about the leadership of other acknowledged nuclear powers, the better.
Determine the areas in which the military issues both tropical and arctic/subarctic clothing to enlisted personnel. (That's "issues as part of the basic uniform set," not "makes available for personal selection.") Fifteen years ago — the last time it really mattered to me! — that meant St. Louis and DC. Everybody else had only "standard weather" gear plus either tropical or arctic/subarctic (and for many areas, neither). Of course, as an officer I wasn't being issued any of it (I was expected to purchase it)... but it was always amusing seeing jungle boots being issued along with fur-lined mukluks.
Tokyo.
Really. Or, at least, by analogy "really."
Perhaps a little bit of a current US bankruptcy will help explain things a bit. As readers of this blog are probably aware, Borders — the No. 2 US trade bookstore chain — declared bankruptcy earlier this year and is being liquidated. Although the final figures are not in, the preliminary distribution figures are both disturbing and fascinating.
* In the real world, all creditors (publishers included) are going to get between 10% and 11% of their claims.
* If one removes the forward-looking landlord claims (that is, if there are fifteen years remaining on the lease, the claim is for that entire fifteen years of rent, under the presumption that the landlord will be unable to re-let the property), that figure balloons to about 32%.
* If one goes further back and resets rents to their levels at the time the given stores were initially located, that figure goes well over 40%... and allows enough cashflow that bankruptcy would not have been necessary in the first place.
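A sketch of the arithmetic those preliminary figures imply — assuming, as a simplification, that the distributable asset pool is the same in each scenario and only the claim pool changes:

```python
# Recovery rate = distributable assets / allowed claims. The two recovery
# percentages are the ones quoted above; everything else is derived.
assets = 1.0                     # normalize distributable assets to 1
claims_all = assets / 0.105      # claim pool implying ~10.5% recovery
claims_trimmed = assets / 0.32   # claim pool implying ~32% recovery

# Share of the nominal claim pool that was forward-looking lease claims:
forward_share = 1 - claims_trimmed / claims_all   # about 0.67
```

In other words, on these figures roughly two-thirds of the nominal claims against the estate were future rent on leases the stores will never occupy — which is the point of the second bullet.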
The "Tokyo" problem is the linkage between rents (using the term in all three of its economic senses) and the presumption of steadily increasing land values for land that is not changing hands. In the instance of the High Street, most of the underlying freeholds change hands a couple of times a century or so, but the rents are constantly increasing despite the constant cost basis. The reference to "Tokyo" concerns the 1980s and 1990s real-estate bubble there; in 1989, for example, if one deconflicted the accounting, the approximately 40 km² of central Tokyo was valued at more than all of the nonpublic lands of the state of California... and Oregon. This, in turn, was used to "secure" financing for interlocked-directorate corporations.
My point is that there seems to be an expectation of constantly increasing returns from real property that is not supported by actual constantly increasing cost basis on those real property investments — and high-street rents for shoppes are just one reflection of that. This was a particular problem for Borders because the corporation leased only the land itself, and not the building (well, it's more complicated than that, but that's the basic model), so the landlord couldn't even claim the necessity of repairs to the building as a steadily increasing cost basis. I hesitate to think what Hatchard's in Ipswich might be like today; the news that Foyle's is leaving its iconic location in London for a new, purpose-built building just reinforces the issue.
The less said about the historical relationship of leaseholds, property values, entailment, and seisin, the better... but then, I actually lived on a frankalmoign (lands appurtenant to a historic abbey) for a while when I lived in Blighty, so I had incentive to start understanding the estates in land (and their economic expectations and basis) long before law school.
Broadband 45d/5u will be $14.95US a month starting in February, through the city. That is, four months before I move 3000km...