March 2002 Column




Usability

"Private Eye" used to run a column titled "Pissed Old Hack Baffled by New Technology", documenting the collision between the lead type afficionados of Fleet Street and the invading cyborgs from Planet Murdoch who replaced all their precious manual typewriters with DTP systems.

This is the March issue of Shopper, but editorial lead times being what they are, I'm writing this column in December in the run-up to Newtonmass -- the anniversary of Sir Isaac Newton's birthday, which fell on the 25th of December, 1642 -- and I've been drinking in seasonal quantities. Possibly as a consequence of the bibulous festivities, I'm having trouble with some new (to me) technology. I leave my age up to your imagination: let's just say that I predate the VCR and I'm beginning to go grey.

Because it's Newtonmass next week, I've been buying new consumer toys to stack under the gently moulting pine tree. Being a sad bastard (see previous columns, any one at random will do), these are mostly electronic widgets with so many interfaces and controls that the paperwork is fatter than the pilot's manual of a Cessna 126. How the hell this stuff ever reached market is beyond me.

(Yes, this is the Linux column. And I'm going to get round to Linux eventually. Carry on reading ...)

Once upon a time I went for about four years without a television set. I've never watched much TV, but once I surrendered to the Borg I felt obliged to buy a video recorder so that I could tape programs that were on while I was out and watch them later. (I never got round to them, oddly enough.) A couple of years later, I moved to a location with really bad TV reception and tall buildings on every side; and so I acquired a cable TV box as soon as they were available. Yesterday I succumbed to another consumerist lure. In Shopper 164 I rambled on a bit about playing DVD movies on your Linux system. This is all very well, but laptop screens are a bit too small and sitting in front of a desktop PC is not my idea of a comfortable night watching the box. So I finally gave in to temptation and bought a cheap multi-region DVD player.

Television was invented around 1924 by a Scot, John Logie Baird. Baird's TV system relied on an arrangement of rapidly rotating disks with slots in them. A beam of light, the intensity of which fluctuated with the signal strength of an AM long-wave broadcast, was scanned across the back of a projection screen by these disks. User interface? Don't be silly! Even the subsequent all-electronic cathode ray tube based TV systems didn't have much in the way of controls. You had an aerial input that received a signal, and tuning controls so that you could pick a frequency to receive. You then had some basic controls for adjusting the picture, or rather, for directly controlling the horizontal and vertical sync frequency, deflection (via the current in the X and Y coils), picture intensity (voltage feeding the cathode) and so on. There was, in other words, a direct physical parameter you could adjust hiding right behind each of the controls.

Before about 1950, if you wanted to record a TV program you were stuffed: all you could do was point a film camera at the screen. But it was clear that the new field of magnetic audio recording could help. In 1951, Ampex Corporation researchers first captured the VHF broadcast signal on tape, and in 1956 they put their first video tape recorder on sale -- at the then eye-watering price of $50,000.

Both TV and VTR technologies evolved linearly until the late 1970's. They got smaller and more efficient, added colour and UHF frequencies, and the tapes came in paperback-sized cassettes -- but they were essentially both lossy analog technologies, and the signal processing tasks of a 1978 Sony Betamax recorder (if not the integrated circuits doing the job) would have been comprehensible to an Ampex engineer in 1956.

But something weird happened during the 1980's and 1990's. Analog video ran up against a new contender: digital video. Instead of a modulated analog signal, the digital systems relied on a synchronous bit stream that could be compressed and decompressed, allowing far more data to be transmitted in a given frequency channel -- and, with the addition of check data, received with fewer errors. Furthermore, when you break up the digital data stream into labelled packets you can route them efficiently over a computer network, reassembling the stream wherever it's needed and multiplexing streams in a single channel.

Now, here's an interesting point about digital video: because you're no longer passing a simple analog stream from receiver to display device, or from receiver to recorder, you are susceptible to all the routing headaches that afflict computer networks. You can have multiple data sources, multiple destinations -- and multiple headaches, as a result.

My first TV was straightforward: it contained the display and the TV receiver in one box. You plugged an aerial into the receiver and whatever it received showed up on the screen.

Then I added a VCR -- but the VCR cheated: it also contained a tunable receiver. I shoved it under the TV, plugged the aerial into the VCR's tuner, and plugged the VCR's analog output lead into the TV's input. A bit of fiddling to tune the TV to whatever the VCR was pumping out on its own channel, and I had a working system.

The first inkling that the user interface for consumer electronics had been overtaken by the capabilities of the technology came when I added the cable box. I did the obvious thing, and daisy-chained the cable TV decoder in series with the video, so the output from the cable box went into the video, and the output from the video went to the TV.

I should have realised things were going to get hairy when the new cable TV box arrived. Consumer gadgets that have RS-232 interfaces are bad enough; when they sprout ethernet as well, you just know that murky software configuration problems are going to creep into the picture sooner or later. (Please note that I'm a coward; I haven't taken a port scanner to my cable box, or attempted to install Linux on it. That's not what this rant is about.)

My new arrangement worked, but was not entirely satisfactory: the main reason was that it was impossible to use the VCR to record a cable TV program while watching another cable TV channel. From being a data source and a logging device, the VCR had been demoted to a logging device. There were also picture and sound quality issues: feeding an analog signal to the VCR, and another analog output to the TV, introduced noise into the circuit. So I switched to the industry-standard SCART interface, a consumer-oriented standard for combining video and sound channels in a single cable. (Consumer-oriented in this context appears to mean that, unlike SCSI, it won't blow your TV up if you plug it in while the thing's running. It doesn't say anything about user-friendliness ...)

The problem with SCART is that it's designed to hook two devices up -- say, a video and a TV. The VCR, being well-designed, has two SCART sockets: an input socket (for cable or satellite TV decoders), and an output socket (to go to the TV set). The TV, being old and cheap, has but a single SCART socket. The cable TV box has two SCART sockets -- one for a VCR and one for a TV -- and the new DVD player has one (for output only, it not being a recorder). SCART is supposedly bidirectional, but that assumes you've got equipment that's wired up properly -- because the standard was developed by committee and several manufacturers jumped the gun, not all SCART connections work the way they're supposed to.

I'm not going to bore you with the details here: let's just say that it took us (myself and a lunatic with a background in system administration who's used to cabling up entire companies) four hours to get the tottering stack of boxes under my TV working. And it still requires weird ritual button-pushing: to watch a DVD I have to turn the TV on, turn the DVD player on, then turn the cable decoder on (to kick a signal out over the SCART connector) and off again (to leave the connection free for the DVD player's signal).

See, by buying the DVD player I'd gone from having a single data source (the cable TV decoder, or maybe a rooftop antenna) to having two data sources. And SCART is designed to connect peripherals together like a UNIX pipe. SCART isn't a networking socket and doesn't know anything about routing, and if I turn the VCR on and hit 'play' it boots the DVD off the cable box (which is acting as a kind of brain-dead router) until I turn everything off and on again. The whole thing is crying out for a switchbox, and meanwhile the pile of remotes is getting to the stage where I'm thinking about wiring up a common power supply and duct-taping them to a board. It looks like the flight engineer's panel in Concorde's cockpit.

Now, on a personal note this is entirely my fault for being a cheapskate. The TV is seven years old and will be forced to keep staggering along until it dies; I'm not a home cinema enthusiast. But on a different level, this is symptomatic of what happens when a consumer technology is overstretched by consumer expectations. Daisy-chaining SCART devices like a SCSI chain is just about okay, but trying to twist it into a USB-like tree requires either special hardware or a special kind of insanity.

It's the same problem you run into with UNIX pipes and metadata -- and with making objects internal to one process accessible to another.

In the beginning, UNIX had a really neat mechanism for taking data from one program and passing it to another: the pipe. UNIX programs were simple command-line tools that did one job, and did it well. Each of them had an input, and an output (except for those programs that didn't, but let's not muddy the water here). They were intended to process streams of bytes, including text files that were simply streams of bytes with newline characters embedded in them at odd intervals.

The problem with exchanging data over a pipe is one of control. How do you know when a data record begins or ends? What if the data is structured? UNIX pipes couldn't answer this problem because they were agnostic about content. Sometimes this is fine -- if all you want to do is move data in a straight line it's great -- but sometimes it's a royal pain in the neck. For example, take the humble pattern-matching tool grep. Grep expects a stream of data to scan (text files, usually) for a pattern. It emits as output either the matches, or everything except the matches. But because human beings use computers, grep is often used to search files for things that people have forgotten. And rather than grepping on each file separately, people like to say 'grep some_pattern_goes_here *.txt' inside a directory and see a nice list of matching lines ... and the filenames that contain the matches. Filenames? That's metadata -- something other than the data stream, in this case the location of the data source in the filesystem.
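
To see just how little a pipe gives you to work with, here's a minimal sketch in Python (chosen purely for illustration -- it isn't anything the classic UNIX tools do themselves). Two "records" written into a pipe come out of the other end as one undifferentiated lump of bytes:

    import os

    # A pipe carries bytes and nothing else: no record boundaries, no types,
    # and no metadata about where the bytes came from.
    reader, writer = os.pipe()
    os.write(writer, b"first record")
    os.write(writer, b"second record")
    os.close(writer)
    print(os.read(reader, 4096))  # b'first recordsecond record' -- where did one end?
    os.close(reader)

Any structure has to be smuggled in-band -- newlines, colons, whatever -- which is exactly what grep does when it glues a filename onto the front of each match.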

SCART cables don't do metadata. All they do is shunt RGB video signals and stereo sound along. There's a line that indicates to a SCART device "hey, I've got an incoming signal", but no way of signalling the end of data (which is why I keep having to switch the damned video stack on and off).

Similarly, UNIX pipes don't do metadata. Or rather, they do it really, really, badly. If you want to use the output from that 'grep foo *' command, you need to run it through 'cut' to separate the filenames from the matched lines, and throw half of it away. If you want to do something complex -- like use both the filenames and the matched expressions -- you have to write a program in awk or perl; the native UNIX tools simply aren't powerful enough, or expressive enough, to cope with data and metadata combined in a single channel.
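
For the curious, here's roughly what that awk or perl one-liner ends up doing, sketched in Python; the filenames are placeholders, and the colon-splitting is exactly as fragile as it looks:

    import subprocess

    # Run grep over a couple of (hypothetical) files, then prise the metadata
    # -- the filename -- back out of the combined "filename:match" text stream.
    result = subprocess.run(["grep", "foo", "notes.txt", "diary.txt"],
                            capture_output=True, text=True)
    for line in result.stdout.splitlines():
        filename, match = line.split(":", 1)  # breaks if a filename contains ':'
        print("%s  (found in %s)" % (match.strip(), filename))

The point isn't the dozen lines of code; it's that the moment you care about anything beyond the raw byte stream, you've left the territory the pipe was designed for.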

The weakness of pipes (and their network-aware variant, sockets) as an abstraction for inter-process communication gave rise during the 1980's to the idea of ORBs -- Object Request Brokers. An ORB is like a SCART switchbox for pipes -- only one that also has a magic remote-control interface to the devices that feed it data, so it can start and stop connections at will. A program that needs to communicate complex structured data with another program can expose a description of the data structure, and a bunch of public methods that can be used for manipulating the data, to the ORB. Other programs can connect to the ORB and issue requests that the ORB then feeds to the data source. In effect, it's a universal translator (and network-aware, at that, for ORBs provide access to objects remotely). There's a standard way of working with ORBs -- CORBA -- and a few variants; Microsoft's ActiveX is simply a re-branded (and insecure) network-aware version of OLE, Object Linking and Embedding, which was Microsoft's own attempt at implementing an ORB back in the pre-TCP/IP early 1990's.

Using an ORB, unlike a pipe, requires serious programming muscle. You need to specify the parameters your program is going to manipulate using IDL (interface definition language), write classes that implement the back-end, compile everything, then try connecting to the ORB. There's not -- yet -- anything quite like a shell programming environment for ORBs, unless you count applications that expose their public interface in the form of HTML via the web. Zope is essentially a special-purpose ORB for Python and Perl scripts, accessible via the web (or via FTP, or even a shell-type interface); IBM WebSphere Application Server is an ORB that provides access to a range of back-end objects by way of HTML and the web. However, although ORBs are hard to use directly, they're too useful to ignore: you can use an ORB to implement clustered applications, farming out requests to whichever back-end server is least heavily loaded as they come in. They're also essential for the newer desktop environments. While KDE 2.2.1 doesn't use an ORB (it uses a lighter IPC mechanism called DCOP), it was originally designed with an ORB in mind, and KDE 3 will move in that direction. The competing GNOME desktop is built around an ORB, ORBit; ORBit passes objects between GNOME applications that use the Bonobo framework, a set of CORBA bindings that provide mechanisms for creating reusable objects.
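
To get a feel for the shape of the thing -- and only the shape; this is Python's standard-library XML-RPC modules standing in for a real ORB, not CORBA -- here's a toy server that exposes a named method other processes can call, rather than squirting an anonymous byte stream down a pipe. The method and its data are invented for the example:

    from xmlrpc.server import SimpleXMLRPCServer

    def list_tracks(disc_id):
        # Hypothetical back-end method; a real server would go and look
        # the disc up somewhere.
        return {"disc": disc_id, "tracks": ["01 - Trailer", "02 - Main feature"]}

    server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
    server.register_function(list_tracks)
    server.serve_forever()

    # A client elsewhere would then do something like:
    #   from xmlrpc.client import ServerProxy
    #   print(ServerProxy("http://localhost:8000/").list_tracks("dvd-0042"))

A real ORB adds the IDL contract, language bindings, object lifetimes and the rest on top of this -- which is where the serious programming muscle comes in.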

If the future of desktop software development on Linux entails a move away from pipes and simple metadata-unaware IPC mechanisms, towards the complexities of ORBs and component-based software, what about the future of my television and the tottering heap of boxes piled up under it?

There are several lessons to be derived from this story. First and foremost, those carping critics who insist that Linux is too complex to deploy on the desktop obviously haven't wired up a home entertainment system lately. The complexity of systems that consumers will put up with is directly proportional to their perceived usefulness. If consumers need to grapple with SCART connections in order to watch TV, then by golly they will grapple! The perceived usefulness of a home PC is lower than that of a television -- it's generally used as a glass typewriter, a teletext replacement, a games console, and little more, which is where the reluctance to invest time and energy comes from. None of these are essential functions, so why spend days agonizing over them?

A second lesson is that standards that confuse control channels with data channels get extremely messy once the number of data sources exceeds two -- as it did when I added a DVD player to a cable decoder and a VCR. The TV manufacturers made a critical mistake in the 1980's by not future-proofing their design for an interchange connector; but because they make their money by selling us new devices, they lose nothing by the oversight. Industries that serve consumers have no interest in making their products long-lived. If you want long-lived solutions, you need to go pester the engineers who have to maintain them in the long term.

A third lesson is that network bandwidth is going to kill television. My cable TV box coexists with a cable modem. If the TV channels coming into it were all streamed via multicast UDP, compressed using DivX or MPEG4, not only would that make more efficient use of the bandwidth coming into my house; it would make it far easier for me to watch TV. Stream it onto your hard disk to buffer it up for later, or forward it to your laptop via an 802.11 wireless ethernet link: it makes more sense than this mass of conflicting black-box standards that we currently put up with, doesn't it? The barriers to such a media solution are political rather than technological at this point -- there's no room to reiterate the whole tiresome copy-protection/prevention, RIAA, "consumers can't be trusted!" flame-fest, but suffice it to say that the content distributors don't want to see a world in which data is fungible.
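
As a rough indication of how small the receiving end of that world would be, here's a sketch of joining a multicast group with Python's standard socket module; the group address and port are invented for the example, and what you do with the packets is left as an exercise:

    import socket
    import struct

    GROUP, PORT = "239.255.0.1", 5004  # made-up multicast group and port

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))

    # Ask the kernel to deliver packets addressed to this multicast group.
    membership = struct.pack("4s4s", socket.inet_aton(GROUP),
                             socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, membership)

    while True:
        packet, sender = sock.recvfrom(65536)
        # Feed the compressed video to a decoder, or append it to a file
        # on disk to watch later.
        print("%d bytes from %s" % (len(packet), sender[0]))

One receiver program, whether it lives in the set-top box, the VCR's successor, or a laptop on the sofa; the routing and the switching become the network's problem, not a tangle of SCART leads.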

So it seems like we're going to be stuck with these SCART cables for a while longer, and I should have stuck to watching DVD's on my Linux laptop with DeCSS.

Happy video hacking!

