Alternative Operating Systems




(Published July or August 1992, Computer Shopper)

Your operating system is the most important piece of software that you own; it's also the least thought about. Without it, your computer is next to useless: a humming lump of plastic and metal, incapable of even finding its own disk drives, much less loading and running the applications you use on it. Despite being invisible to the casual user -- for the most part -- the operating system is an important part of the working environment; in fact, it is the environment.

The purpose of an operating system is to provide a uniform interface to the hardware resources of a computer, so that any piece of software written to run under the operating system can use those resources with maximum efficiency. The first level of the operating system, the kernel, contains drivers that enable it to talk to all the devices attached to the computer; the kernel also typically contains routines for accessing the file system, loading programs into memory, scheduling context switches (if it carries out multi-tasking), swapping and paging programs into virtual memory (if it uses virtual memory), and lots of other wizardry. On top of the kernel sits all the other paraphernalia of the operating system: programs to accept logins from remote terminals, interpret user commands, execute programs at a set time, communicate via e-mail, make file systems available over a network ... the list goes on.
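
To make that uniform interface concrete, here's a minimal sketch in C for a UNIX-like system (the file name "notes.txt" is invented for illustration): the same open, read and write calls work whether the thing on the other end is a disk file, a terminal or a printer, because the kernel's drivers hide the differences.

    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    /* The kernel's uniform interface in action: one set of calls
       for every device.  "notes.txt" is a made-up file name. */
    int main(void)
    {
        char buf[256];
        int fd = open("notes.txt", O_RDONLY);   /* a file on disk... */
        int n;

        if (fd < 0) {
            perror("notes.txt");
            return 1;
        }
        n = read(fd, buf, sizeof buf);   /* ...read like any other device */
        close(fd);
        if (n > 0)
            write(1, buf, n);   /* descriptor 1 is the user's terminal */
        return 0;
    }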

Operating systems can be large or small. The smallest, such as that of the old ZX81, was so small that it fitted in a single ROM chip and had no capacity to deal with peripheral devices other than a tape drive; such a proto-operating system is called a monitor, and is mainly used for loading and saving programs which then have total control over the machine's resources. (The monitor in this case was accessed via the Basic interpreter, also on the ROM.)

DOSing around

MS-DOS barely qualifies as an operating system if it's compared with the accepted baseline for such programs. True, it provides an interface to peripheral devices, a file system, a means of loading programs, and a range of utilities that include a shell (COMMAND.COM) that interprets typed commands. Nevertheless, it doesn't stack up in real terms; it can't multi-task or handle virtual memory, it's slightly stretched by the task of handling twenty user logins at once, and its user interface is, unfortunately, more primitive than that of the Apple Macintosh. So why has DOS been so successful -- and, in view of that success, why is everyone saying it's on the way out?

The reason can be traced back to the period 1981-1982, when IBM were designing the PC. Until then, the main operating system used by business PC's was CP/M, from Digital Research. Written in the mid-seventies, CP/M was descended in style from the early DEC minicomputer OS's. Given the limited resources of PC's in those days -- CP/M had to run on machines with only 16K of RAM and a tape drive -- the main goal of the OS was to load a program and provide it with basic i/o facilities. The typical microprocessors used in business computers of the day were 8-bit chips, the Intel 8080 and the Z80; CP/M had to fit everything -- including the application it was running -- into the 64K of memory addressable by the 16-bit pointers used by those chips. Because everybody had their own idea of how to design a personal computer, there was no standardization: disks were a maze of proprietary formats, and while character-based displays were more or less universal, most of them used different escape codes to deal with little extras like cursor positioning. It was a period of chaos, brought to an end only when IBM acquired a stranglehold on the market in 1983.

The original IBM PC had a number of characteristics which seemed perfectly reasonable at the time, but which today rank among the most arcane and inexplicable design decisions in the history of personal computing.

Firstly, the microprocessor is an Intel 8088, a cut-down version of the 16-bit 8086 and a direct descendant of the 8080 family; it preserves the 16-bit pointers used by the older 8-bit processor, getting around the limits of its addressing range by combining each pointer with a 16-bit segment register, shifted left four bits, to provide a 20-bit address range. (Why Intel decided on a 20-bit address range, and not 24 bits -- like Motorola, with the 68000 -- or 32 bits -- like any modern RISC chip -- is just one of Life's Mysteries.)
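
To see what that segmented scheme means in practice, here's a back-of-the-envelope sketch in C (illustrative arithmetic only): the physical address is the segment shifted left four bits plus the offset, so many different segment:offset pairs name the same byte of memory.

    #include <stdio.h>

    /* 8086 real-mode address arithmetic: physical address =
       (segment << 4) + offset, truncated to 20 bits. */
    unsigned long physical(unsigned int seg, unsigned int off)
    {
        return (((unsigned long) seg << 4) + off) & 0xFFFFFUL;
    }

    int main(void)
    {
        /* Two different segment:offset pairs, one physical byte. */
        printf("%05lX\n", physical(0xB800, 0x0000));   /* B8000 */
        printf("%05lX\n", physical(0xB000, 0x8000));   /* B8000 again */
        return 0;
    }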

Secondly, the operating system, MS-DOS, is a sort of grown-up clone of CP/M. This has caused no end of trouble -- the famous 640K memory limit is just the tip of the iceberg -- but in 1980 it made sense. Business computing was dominated by CP/M. IBM wanted a computer that could run CP/M software with very little trouble. The conversion of Intel 8080 code to software that will run on an 8086 is a task that can be automated; add a CP/M-derived operating system, and voilà: instant software compatibility ... and another generation is doomed to run Wordstar. It's said that at one stage IBM actually approached Gary Kildall, head of Digital Research, with a view to obtaining a license to CP/M-86, which runs on the 8086; but for some reason he was unavailable, and the boys from Big Blue wandered over to Microsoft, to be greeted by an extremely interested Bill Gates. The rest is history, except for Digital Research -- which is now part of Novell.

The important thing to remember about the IBM standard is that MS-DOS and the Intel 8086 were not inevitable. Intel was working on a true 32-bit microprocessor in those days; only its failure to ship in time convinced IBM to pick the technologically inferior 8088 (a cut-down 8086). They might even have waited for the Motorola 68000, which was already on course to become a success. Likewise, if IBM had been willing to abandon CP/M compatibility, they could easily have selected Xenix as the operating system for their PC. With Xenix -- a derivative of UNIX Version 7 -- and a 68000 processor, the IBM PC could have strangled the workstation market before it was born. And, even today, with the right operating system a PC can do things which might surprise you.

Real Operating Systems Don't Eat Quiche

Times change, and so do computers. The typical modern PC, costing perhaps £1000, is based on an 80386 running at 20MHz, with 4Mb of RAM, an 80Mb hard disk drive, and a colour VGA display. In 1982, when the PC first appeared, the equivalent feature list was the domain of scientific workstations priced in the stratosphere; even in 1984, a PC XT with a 10Mb hard disk, 512K of RAM and a 4.77MHz processor would have set you back four or five times as much money for a tenth or less of the performance. Yet the operating system remains the same. The stupendously powerful 386 -- as powerful as a mainframe of the late 70's -- runs an operating system descended from a system designed to run in 16K of RAM. It can't use more than the first 640K of all those megabytes, it can't take advantage of the 386's multi-tasking capabilities, and it can't even draw pretty windows on the screen.

It's not surprising that more and more people are turning to Windows as a solution to the problems of DOS. Windows offers multi-tasking, after a fashion, and a fairly usable interface, unless you happen to be used to a Mac. It even offers virtual memory: a real boon after DOS. But it's still not the real thing. Windows doesn't so much ignore DOS's deficiencies as work around them very cleverly -- which is why it's so slow. But what else can you do?

You might be surprised.

DOS is not the only key to unlocking the power of a PC. If you need to run DOS-compatible applications, there are a number of alternatives: OS/2 2.0, Concurrent DOS, DOS plus Windows, and so on. If you don't need to run DOS programs, the sky is the limit. Most operating systems provide as integral components services which DOS users have to buy as applications. For real-time operations -- where a response must be guaranteed within a set time, as in medical or safety-critical systems -- there's QNX. For multi-user operations there is a plethora of UNIX systems and clones, from SCO, Interactive (SunSoft) and others. The commercial UNIX field is probably worth another look, but space forbids; it's another world, one considerably bigger and stranger than that of DOS.

Most of these operating systems are relatively inaccessible if you own a PC and are coming from the DOS world; either they're expensive, or they're unfamiliar, or both. Among the non DOS-like systems, the market leaders, SCO UNIX and Interactive UNIX, both start at over £500; a full UNIX development system with networking facilities and a GUI can easily cost £1,000-£2,000. On the other hand, a full UNIX OS can turn a PC into a minicomputer substitute; I regularly use a machine with fifteen other users on it at the same time, and I've seen multi-processor PC's with over a hundred terminals, two hundred Mbytes of RAM, and several gigabytes of disk space -- mainframes in all but name. If all you want is to get a flavour of the beast, though, your requirements are rather more modest, and a number of low-cost operating systems exist. Of these, the best known are Minix and Coherent. I'll also be looking at a new contender: Linux, which is free.

Minix is a teaching system, written by Professor Andrew Tanenbaum. His book, Operating Systems: Design and Implementation, is one of the classic undergraduate texts; it shows -- with the aid of source code -- how Minix is put together, using it as the core of a first course in operating system design. Minix itself is very similar to UNIX Version 7 (which was released in 1979); it's both efficient and elegant, and the source code is available on disk from Prentice-Hall, publishers of the book, along with executable copies for a variety of computers. Minix, until recently, was the best way to learn how operating systems work; while it doesn't have loads of advanced features, it is nevertheless a working multi-user, multi-tasking system that runs on just about anything and has a thriving following of programmers who use it as a test bed and a development environment. Compilers and software tools are available for it; however, the standard version of Minix on the PC is limited to the small memory model imposed by those pesky 16-bit pointers, so programs written on it cannot exceed 64K for code plus 64K for data.

Coming from a different direction entirely is Coherent, from the Mark Williams Company. Coherent is a fully functional clone of UNIX; I reviewed the 286 version in 1990. Although Coherent doesn't come with source code and a book describing how it was put together, it is a cheap, hard-working operating system in the mould of UNIX; if you want to run an electronic mail system or develop software for UNIX, it's a great little system. Version 3.2, the standard, runs on anything from a 286 with 640K of RAM and a 10Mb hard disk upwards, and comes with the tools to do anything you can think of. However, like Minix, version 3.2 of Coherent is limited to the small memory model. The latest version, 4.0 (released in May), is a 386 version with a 'flat' 32-bit memory map that uses the advanced features of the 386; it claims to follow the iBCS2 binary compatibility standard, so that programs compiled on most PC UNIXes should run on Coherent 4. iBCS2 is a standard that specifies how compiled programs are laid out, and what function calls they can make to the operating system; it's a definition for an executable file interface, so that (in theory) shrink-wrapped software for PC-based UNIX systems can be marketed as easily as DOS applications. (A full review of Coherent 4.0 will appear in a later issue of Computer Shopper.)

Born to Run Free

But all the systems above cost money. That's all very well if you know what you're getting into and specifically want a small UNIX-like operating system; but what if you don't? Or what if you can't afford one? What if you're a student or an enthusiast or you just want to get an in-depth feel for non-DOS operating systems without buying a commercial product? What if you want a system which is robust and runs with virtual memory and a 32-bit memory model -- unlike Minix -- but which also provides you with source code -- unlike Coherent? Up until six months ago, this would have been a daydream. But, completely unexpectedly, a new product has joined the cheap UNIXes. It's called Linux ... and it's free.

Linux is a small UNIX for 386-AT clones. Linus Torvalds began writing it during 1991, with the goal of producing a free UNIX kernel which could be combined with public-domain UNIX tools to make a totally free operating system. (It was motivated partly by the fact that the GNU Hurd -- the long-awaited operating system from the Free Software Foundation -- was taking a lot longer to turn up than anyone expected; the Hurd is barely in alpha-test, and is unlikely to show up this year.) Linux is distributed subject to the GNU "Copyleft" license; that is, you're more or less free to use, copy and modify it for any purpose -- as long as you pass those same freedoms on to anyone you give it to, and don't prevent anyone else from getting access to the source code. Linux is currently in beta-test, but seems to be pretty robust; a first general release will almost certainly appear before the end of 1992.

Linux 0.95(c+) implements a subset of UNIX System V, and is mostly compliant with the POSIX standard for software portability. The kernel is designed specifically to run on 386/486 ISA-bus PC's, and is pretty machine-specific; you're out of luck if you run an Amiga or a Mac (although a PD port of the BSD 4.4 operating system is apparently underway for the Amiga A3000). However, Linux looks pretty good so far. A large number of hackers have enthusiastically volunteered their efforts, and each beta release of the operating system shows more and more features. Support for all IDE hard disks, and for some ESDI and some SCSI hard disks (via the ST-01 and ST-02 controllers), is available; Linux also supports Hercules, CGA, EGA and (S)VGA displays, and 3.5" and 5.25" floppy disks. It needs 2Mb of RAM to run -- 4Mb if you want to compile anything without it swapping to disk a lot -- and can handle up to 16Mb of RAM at present.

Linux boots from two floppy disk images: a boot disk (storing an image of the executable kernel, which is loaded into the first 640K of RAM on the PC) and a root disk (containing the root file system). A basic Linux installation will probably take 10Mb of hard disk; with X-Windows, 20-30Mb. (The amount varies because Linux still isn't finished; for comparison, a minimal installation of SCO UNIX 3.2v4 occupies 39Mb of disk, while OS/2 2.0 occupies 32Mb and DOS 5 with Windows 3.1 occupies 13-14Mb. Meanwhile, some IBM mainframe OS's occupy more than 2Gb, or 2000Mb!)

Like a standard UNIX system, Linux has the ability to use different types of file system. (A file system, in DOS terms, is the combination of hierarchical directories, FATs and so forth used to map absolute file positions on a disk to something the user sees as a directory hierarchy; most UNIX file systems have built-in caching and defragmentation, making them more efficient than the standard DOS system. You can choose the file system you want in a UNIX partition in order to fine-tune for performance.) At present, the only file system provided by Linux is a clone of the Minix file system; however, other fast file systems are being added, as are tools to enable it to read and write DOS floppy disks.
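
To make the DOS half of that comparison concrete, here's a toy sketch in C of the FAT idea -- invented cluster numbers, nothing like a real disk layout: each entry in the table names the next cluster of a file, and the file system walks the chain to find the file's data blocks.

    #include <stdio.h>

    #define END 0xFFFF   /* end-of-chain marker */

    int main(void)
    {
        /* fat[c] holds the cluster that follows cluster c;
           toy values for a file starting at cluster 1. */
        unsigned int fat[8] = { END, 4, END, END, 6, END, 2, END };
        unsigned int cluster = 1;

        printf("file occupies clusters:");
        while (cluster != END) {
            printf(" %u", cluster);
            cluster = fat[cluster];
        }
        printf("\n");
        return 0;
    }

This prints "file occupies clusters: 1 4 6 2" -- the same chain-following a DOS file system does every time it reads a fragmented file.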

Linux also has a sophisticated virtual memory system. Multiple instances of a program running at the same time can share code; unused pages can be paged out, or entire programs swapped out to either a swap partition or a swap file (whichever you prefer to use).
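
A quick way to feel the difference is this toy sketch in C (the 8Mb figure is made up -- adjust SIZE to exceed your own RAM): allocate more memory than the machine physically has, then touch every page. Under Linux the kernel quietly pages the excess out to swap; under bare DOS an allocation like this simply fails.

    #include <stdio.h>
    #include <stdlib.h>

    #define SIZE (8UL * 1024 * 1024)   /* 8Mb: more than a 4Mb PC has */

    int main(void)
    {
        char *p = malloc(SIZE);
        unsigned long i;

        if (p == NULL) {
            fprintf(stderr, "out of virtual memory\n");
            return 1;
        }
        for (i = 0; i < SIZE; i += 4096)   /* touch one byte per 4K page */
            p[i] = 1;
        printf("touched %lu pages\n", SIZE / 4096);
        free(p);
        return 0;
    }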

The display facilities of Linux will come as a surprise if you're used to DOS. Like all UNIX systems, Linux sees devices (things like CON: or PRN: or LPT: in DOS) as files in a directory hierarchy called /dev ; for example, /dev/hd0 would be the first hard disk drive, /dev/lp would be a line printer, and /dev/kmem is the kernel memory space. Screens in UNIX are terminal devices; for example, /dev/tty01 , where "tty" is short for teletype. So by issuing a command like echo "hi there!" > /dev/tty01 you can make the words "hi there!" appear on teletype number 1, wherever it is. (This comes in very handy if you have fifty terminals plugged into your UNIX system.) Because most Linux systems have only one user -- it's a PC system, remember -- Linux also makes extensive use of pseudo-tty's, or ptty's. A pseudo-tty is simply a full-screen window on the console; you flip between them by hitting [Alt]-fn, where fn is a function key bound to the desired pseudo-tty. That way, you can log in three times over and be editing a file on one pseudo-tty, running a lengthy compilation in another, and have a shell (command interpreter) available in a third; sort of like DESQview.
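
The same trick works from C as easily as from the shell. A sketch (it assumes a /dev/tty01 device node that you have permission to write to):

    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    /* Devices are just files: write a greeting to the second
       virtual console via its entry in /dev. */
    int main(void)
    {
        int fd = open("/dev/tty01", O_WRONLY);

        if (fd < 0) {
            perror("/dev/tty01");
            return 1;
        }
        write(fd, "hi there!\n", 10);
        close(fd);
        return 0;
    }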

The biggest surprise comes when you examine the software available for Linux. One of the hallmarks of UNIX is that there's a vast amount of public domain software, some of which is so powerful that it's become part of the standard operating system. For example, the MMDF electronic mail package is pretty ubiquitous; and X-Windows, the standard windowing system for UNIX (on which GUI's like Motif and Open Look are based), is also public domain. To cap it all, rumour has it that someone is working on a DOS emulator for Linux. Such a program would enable Linux to run DOS applications, in much the same way that VP/ix works for UNIX. While it's a long way off, such a program would turn Linux from an enthusiast's system into a real contender.

Because Linux handles lots of memory and a 32-bit memory model, most public domain software can be persuaded to run on it (with a bit of messing around). The GNU C compiler is the standard Linux compiler; using GNU C, almost all the Free Software Foundation's GNU software has been ported to Linux, including GNU EMACS (the editor to end all text editors), GROFF (the GNU roff typesetting system) and Donald Knuth's TeX typesetting system. Even X-Windows has been ported to Linux, although you need a super-VGA monitor and bags of memory to run it; 8Mb is the minimum sensible. (This is not Linux's fault; X-Windows is one of the most astonishing memory hogs ever written.)

A note of caution

Linux, as this article was being written, was still in beta-test. Although it's widely available over the Internet, it hasn't yet reached the stage at which it's safe for inexperienced users to mess with it. The documentation is patchy and highly technical; some skill with a C compiler and some knowledge of UNIX and PC's are essential in order to get over any installation hitches. Trying to install Linux without knowing what you're doing is a sure recipe for blowing away the contents of your hard disk drive. Nevertheless, these are teething problems; Linux is the most exciting development in public domain software this year, and looks certain to mature into a powerful, functional operating system for the rest of us -- those who are into computers for their own sake, or who want to run a UNIX system but can't afford the commercial product, or who just want access to Usenet or bulletin boards (UNIX-type systems are virtually perfect platforms for bulletin boards or e-mail systems). It will also make a great tool for teaching operating systems theory: here's one you can use at home without paying masses of money. Watch out for release 1.00, real soon now ...

