Workstation (Windows, Macintosh, UNIX, Linux, DOS)
Server (Windows, Linux, UNIX, Netware, Macintosh)
Mainframe (UNIX, Linux, proprietary)
There is some overlap, and a bit of cross-functionality, but these designations would be familiar to most users. You might notice that only two operating systems can be used on all three types of platform. Until very recently, there was only one; this was UNIX. The other operating system seen in all three types of computing is Linux, the new kid on the block and a UNIX variant.
UNIX is universal on mainframes, and is by far the dominant OS on servers (if Linux is included). Even some proprietary operating systems (such as AIX) are only proprietary to the extent that they are non-standard variants of UNIX, with features added, or with emphasis on a particular set of tasks or a particular computer architecture (as with UNICOS, the UNIX-derived OS for the new generation of Cray computers, which replaced the proprietary COS).
UNIX has owned the mainframe and minicomputer markets since the mid seventies, when it was largely the only thing out there. It was, without a doubt, the only OS capable of running on multiple hardware platforms. It was also the only OS with which many of the upcoming generation of computer students were familiar. UNIX is, since the retirement of MULTICS in 2000, the only survivor of the first generation of operating systems.
So, OK: UNIX is historically important, and has the mainframe and perhaps the server markets all sewn up, but everyone in business and at home uses Windows, with perhaps a small portion using Macintosh, and a dedicated bunch using Linux or some other esoteric system. This hardly makes UNIX, even if Linux is included as a variant, the dominant operating system. Netware once owned the small and medium server market, but is now little more than a curiosity, having largely lost that market to Windows. So dominance in the server market hardly guarantees success in the market at large. Why, then, might Linux/UNIX become the dominant operating systems of the future?
The basic functions of a computer operating system are to run programs, keep track of data, run the computer hardware, and present a viable user interface. These days, an OS is also expected to have networking features, utilities for maintaining and improving the efficiency of the machine, backup capabilities, and to embody the ideal of being “user friendly”. The concept of an operating system is so pervasive these days that it is sometimes difficult to realize that it is a development of the sixties, and was not really in place until the seventies.
Early computers had no operating systems, and their operation would seem strange and cumbersome to us. In order to understand the revolutionary changes wrought by the introduction of the operating system, it might be entertaining to look at the ways in which computers were originally used, and at what the concept of the computer was before the advent of the operating system. This will also help to explain how something that was not even a concept in the early sixties became a necessity by the early seventies. An understanding of the introduction of the operating system may explain how computer professionals and users lost control of their machines to a few monolithic corporations, through the dominance of a small number of operating systems, and how they may be in the process of getting this control back through the new open source concept.
Credit for the first electronic computer generally goes to ENIAC, which was a joint effort (as were most early computer projects) of the Department of Defense (then known as the War Department) and a research institution (in this case, the Moore School of Electrical Engineering at the University of Pennsylvania).
The ENIAC contained 17,468 vacuum tubes, along with 70,000 resistors, 10,000 capacitors, 1,500 relays, 6,000 manual switches and 5 million soldered joints. It covered 1800 square feet (167 square meters) of floor space, weighed 30 tons, consumed 160 kilowatts of electrical power, and, when turned on, was said to dim the lights of its home city of Philadelphia.
By today’s standards, this was not a very powerful machine, being little more than a glorified calculator. In point of fact, the whole concept of the computer as a data processing machine, as we know it today, did not really gel until sometime in the sixties. Up to that time, computers were generally considered to be super versions of calculators and tabulating machines. The term “computer”, back in those days, was understood to refer to one of the programmers or operators of these machines, rather than to the machine itself.
I am using ENIAC as an example because it is so typical of what computing was before the sixties. These machines really did not require operating systems as we know them today. They eventually needed programming languages, so that they might be more easily instructed in performing a wider variety of tasks, but even as tubes began to replace relays in the operation of these machines, they were more like modular computing kits than programmable computers. Programming was a chore during which the computer was effectively rebuilt:
The program was set up manually by varying switches and cable connections. However, means for altering the program and repeating its iterative steps were built into the master programmer. Digit trays, long racks of coaxial cables, carried the data from one functioning unit to another. Program trays, similarly, transferred instructions; i.e., programs. In purely repetitive calculations the basic computing sequence was set by hand. The master programmer automatically controlled repetition and changed the sequence as required. Programming new problems meant weeks of checking and set-up time, for the ENIAC was designed as a general-purpose computer with logical changes provided by plug-and-socket connections between accumulators, function tables, and input-output units.
ENIAC remained a powerful, first-line computing tool until better machines were made in 1952. It was removed from service in 1955. ENIAC was rebuilt many times during its decade of service. Its modular design made upgrading easy, and through its entire operating life it remained a work in progress.
Thus stood the state of the art in electronic computing until the invention of the transistor and the development of the operating system, which together allowed for a huge increase in power, flexibility, and compactness. This would be the beginning of our present day concept of what a computer is, what it should do, and how it should be used.
In the beginning of computing as we know it today, there was UNIX. Well, not really, but UNIX does date from the transition period of the computer age. This was when a computer was still a large, expensive, incomprehensible machine, ministered to by the chosen few; but smaller, less expensive, and more flexible machines were obviously in the pipeline. User friendliness was not a concept during the sixties and the early part of the seventies. A pretty good case can (and with any luck, will) be made that UNIX was the original platform independent operating system, and that its largely unrecognized parent, MULTICS, was the beginning of the concept of an operating system as used in all modern computers.
What makes UNIX, and the late sixties/early seventies, such a good starting point for this journey is that this is truly where modern PC style computing was born. DOS was highly influenced by UNIX, while Windows was highly influenced by (some say plagiarized from) work done at Xerox’s PARC facility in the seventies. It is also a fact that most of the coming giants of the computer industry, as well as the people who wrote the code, were finding their feet and developing their craft at this time. This was also the period when many of the standards we now take for granted were established.
On top of all of this, by the sixties computers were becoming accessible; you no longer had to be part of the government, big business, or a special project in order to get your hands on one. There were even some people who began to play games on these formerly sacred machines. There were also now a number of companies producing computers, including DEC, IBM, Honeywell, CDC (where Seymour Cray made his name before founding Cray Research), Sperry, and a few others. These companies were actually selling computers as tools for business and laboratory work. Computers were now becoming products, rather than projects.
A staffer at one such early installation recalls:
It's worth remembering how one ran a FORTRAN job in the early days. First you punched your FORTRAN program on a key punch machine, along with any data and control cards. But since the [IBM] 650 had no disk, the FORTRAN compiler was not resident. So to compile your program, you fed the FORTRAN compiler deck into the card reader, followed by your FORTRAN source program as data. After some time, the machine would punch the resulting object deck. Then you fed the FORTRAN run-time library object deck and your program's object deck into the card reader, followed by any data cards for your program. Your program would run and results would be punched onto yet another deck of cards. To see the results, you would feed the result deck into another machine, such as an IBM 407, to have it printed on paper. The computer itself had no printer.
By the early sixties, FORTRAN and COBOL were both in use, but computer operators and programmers were still stuck with archaic means of communicating with their machines, as another early user relates:
By the early 60s a certain division of labor had become the rule, in which "system analysts" would make a flow chart, programmers would translate it to code, which was written by hand on "coding forms" that were given to key punch operators to be punched on cards. The coding forms and card decks were passed on to "verifiers" who repunched the source code to catch and correct any mistakes, signed off on the job, sent the deck to the operator to await its turn at the computer. Hours later the results would be delivered to the programmer in the form of a printout and the cycle would continue. [iii]
The result, for the user, was inconvenient to say the least:
Access to computing was batch only. Users brought decks or boxes of punch cards to the operators and came back the next day to retrieve their cards and the resulting listings from the output bins. Jobs were paid for out of grants or funny money. There were no user terminals and there was no user access to the machine room, which was staffed around the clock by operators and a shift supervisor.
As if all of this were not bad enough, more difficulties arose as computers gained widespread use, and particularly as users began to replace older machines with newer, better, faster models. As computers began to do more things, and more people made use of them in more places, it became obvious that there should be some way to easily move data from one computer to another. This was harder than it might seem from today’s vantage. Though computers might occasionally be connected, there was nothing like today’s ability to network (Ethernet was born on May 22, 1973). The usual medium was the punched card, which was extremely slow to load, but there were even more complications. Everything was proprietary in those days; there were no standards. Cards punched for use on one machine might be unreadable on another. The only way to transfer data was to go back to the original code, have new sets of cards punched for the new machine, and then feed them in. Tedious does not even begin to describe this process. Computers needed a way to talk to each other, or at the very least, data needed to be made portable.
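To make the incompatibility concrete: IBM machines encoded characters in EBCDIC, while other vendors used ASCII or codes of their own, so the very same byte meant different things on different hardware. A minimal sketch in C, purely illustrative (a real translation table covers all 256 values):

    #include <stdio.h>

    /* Illustration only: the same punched byte read on two machines.
       EBCDIC (IBM) puts 'A'..'C' at 0xC1..0xC3; ASCII puts them at
       0x41..0x43. A full converter needs a complete 256-entry table. */
    static unsigned char ebcdic_to_ascii(unsigned char c) {
        switch (c) {
        case 0xC1: return 'A';
        case 0xC2: return 'B';
        case 0xC3: return 'C';
        default:   return '?';  /* everything we haven't mapped */
        }
    }

    int main(void) {
        unsigned char card_byte = 0xC1;  /* 'A' as an EBCDIC machine punched it */
        printf("EBCDIC 0x%02X reads as '%c' on an ASCII machine only after translation\n",
               (unsigned)card_byte, ebcdic_to_ascii(card_byte));
        return 0;
    }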
It also seemed as though there should be some way of giving users better access to computer time than the old batch system allowed. Computers tended to be much faster than their human operators, so why should the computer wait while cards were read into it? Surely there must be a way to give easy access to remote users, so that jobs could be stacked up, or time could be shared.
There was nothing like today’s concept of an operating system back then. Computers were fed their compilers, their run libraries, and their data via card readers, and the whole thing was started and stopped via console switches. Magnetic drum media, card readers, sorters, and other such bits of the system were controlled by individual utilities run as separate programs. Though there was a programming interface of sorts via machine language, or a compiled language like FORTRAN, there was nothing like a user interface. The computer was switched on (or cleared), cards were loaded, and it was run until it finished, at which time the cards holding the answer would be removed and read, and the whole thing would be started over again with a new batch of cards. Programmers, particularly if they used machine language, had to learn the vagaries of each machine with which they worked. If they worked in a particular language, they needed to know whether it was available, in a version that they knew, for a particular machine.
In general, it was becoming obvious that sets of standards would have to be arrived at. It was one thing to have a few monolithic computers doing things their own way while being supported by a huge staff of technicians, and quite another to have hundreds of machines doing things in hundreds of different ways.
MULTICS, the father of the Operating System
In 1964, G.E., Bell Labs, and MIT collaborated on MULTICS (Multiplexed Information and Computing Service). This was done in order to address the potential of the new generation of computing equipment. The idea had been circulating since the late fifties, and was fueled by several deficiencies (mentioned above) in what we would now call the operating environment. In anticipation of the requirements of a whole new generation of computers, MULTICS was designed around nine goals:
Remote terminal access.
Continuous operation over extended periods (no down time).
Scalability and flexibility to work in different hardware configurations.
A high-reliability internal file system.
Support for selective information sharing.
Hierarchical structures of information for system administration and decentralization of user activities.
Support for a wide range of applications.
Support for multiple programming environments and human interfaces.
The ability to evolve the system with changes in technology and in user aspirations.
The design goals of MULTICS speak volumes about all of the things that computers were incapable of doing in those days. Most of these things are taken for granted now. Many of these deficiencies had not been an issue earlier, because previous systems had not had the speed or capabilities of the newer generation. So if MULTICS was such a great system, and was the true parent of much of what we consider to be the modern computer operating environment, why is this paper about UNIX and not MULTICS?
MULTICS had a couple of great drawbacks, and also had the misfortune of costing more, taking longer, and being a bit slower than was originally predicted. The main drawbacks of MULTICS, from a marketing point of view were:
Expensive
Required quite specialized hardware, and was not really platform independent.
Owned by Honeywell, and later by the French firm Bull, which seemed unable to figure out quite what to do with it.
It was also saddled with the stigma of having been abandoned by Bell Labs, which pulled out of the project in 1969. MULTICS has been dismissed by most computer histories as failed, abandoned, a disaster, or as having been a mere learning environment. It is generally only mentioned, when any note is taken of it at all, as the precursor of UNIX. Essentially then, MULTICS is a dead end; but UNIX, its half brother, still lives.
Ken Thompson and Dennis Ritchie were two members of the MULTICS team who worked for Bell Labs. Bell Labs blows its own horn a bit about UNIX snatching victory from the jaws of the defeat which was MULTICS. In point of fact, MULTICS worked fine, but it did not deliver the type of performance that those in charge of Bell Labs desired, as quickly or as cheaply as they desired it. It is also a fact that Thompson and Ritchie received little support in their early efforts, and were reduced to developing UNIX in a rather piecemeal way. This created consequences (a certain oddness in the syntax and command list of UNIX) with which we are still living today.
The initial concept of UNIX was as a scaled-down version of what Ritchie and Thompson had considered a far too ambitious MULTICS. They called it Unics. Depending on the source, this is said to stand for UNiplexed Information and Computing Service, or for “emasculated Multics”. Take your pick. The name was later changed to UNIX.
UNIX was initially worked out on a PDP-7 minicomputer. This was soon found inadequate, and most of the work was moved, in 1970, to a new PDP-11. By this time the project had acquired its name, and enough credibility within Bell Labs to justify the new machine.
The initial work on UNIX, its quirky commands, and its non-standard switches are the result of the unstructured way in which it was developed. In contrast to the well-financed, sharply organized, concerted effort put forth on behalf of MULTICS, UNIX was a bit of a shoestring development, programmed by Thompson and Ritchie by the seat of their pants. One thing UNIX really had going for it was portability:
In general, UNIX system developers and application developers program in the same language, using the same application programming interface. In typical proprietary operating systems, the operating system programmers are programming in assembly language, and have access to many capabilities which are not available to the application developer.
This was heady stuff in the sixties, and as a result UNIX has bounced around a bit, and been added to by a number of different teams and programmers. Here was a real, versatile operating system that could be compiled for use on any computer, and which would run any program written for it, on any platform. There were versions for Sun, HP, IBM (AIX), SGI, and Cray, as well as the yet-to-come UNIX favorite son, Linux. So while UNIX was coming to dominate the world of “real” computers, what was going on in the newly conceived world of the microcomputer?
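A minimal sketch of what that portability meant in practice: a program like the one below, written in C against the documented system-call interface, could be recompiled and run unchanged on any machine with a UNIX port, with no assembly language and no machine-specific knowledge required (the file name is, of course, hypothetical):

    /* Sketch: the same calls the system's own tools use, available to
       any application, on any hardware that runs UNIX. */
    #include <fcntl.h>
    #include <unistd.h>

    int main(void) {
        static const char msg[] = "portable by recompilation\n";

        /* create (or truncate) a file; the interface is identical on
           a PDP-11, a VAX, or a Sun */
        int fd = open("notes.txt", O_WRONLY | O_CREAT | O_TRUNC, 0644);
        if (fd < 0)
            return 1;
        write(fd, msg, sizeof msg - 1);  /* write the bytes, minus the NUL */
        close(fd);
        return 0;
    }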
The Microcomputer (affectionately known as the PC, or personal computer)
Computing takes a giant step backwards
Advances in electronics, and some of the technological inventions related to the space program, resulted in the miniaturization of a number of components, culminating in the single-chip microprocessor.
The initial purpose of these chips was for use in automated devices: CNC machines, cars, automated inventory systems, and the like. It was expected that they would eventually be used in “smart” devices for the new computerized home of the future that had been foreseen ever since the late forties. One thing they would surely not be used for was “real” computers. Of course, we all know what happened next.
The early days of what were then called microcomputers were chaotic at best. This was a market that had never been conceived of or planned for, and much of the history of these little machines reads like a repeat of what had happened in mainframe development. All of this first generation of machines were quirky, buggy, self-built, hard to use, and expensive. Early microcomputers were programmed by flipping switches on a front panel. Programming languages were offered next, and then very primitive, proprietary operating systems. Microcomputers had to be booted by loading software manually. Programs were saved by writing them down, by buying a very expensive paper tape machine, or by getting an A/D converter card and dumping them onto cassette tapes.
These machines appealed to hobbyists, computer professionals, and technophiles. There was little or no software available for them, leaving users to write their own. Most machines could not talk to each other, nor understand programs written for other machines.
Two events occurred which rescued microcomputers from the world of the tinkerer, and made the future of the home PC possible. The first was the introduction of the Apple II. What made the Apple unique was its completeness. You could buy an Apple, open the box, plug it in, attach any accessories, load the included OS, and run your software. You did not have to assemble the computer from a parts kit, write an OS, or program it through panel switches. Eventually, prewritten software was available; in particular, VisiCalc was written for the Apple, and WordStar soon arrived as well. These two programs answered the often-asked question: “OK, it’s very neat, but what can you do with the thing?”
The Apple used the 6502 CPU, while most other computers used the newer Zilog Z80. The Z80 machines were better in most ways, but they were produced by a number of manufacturers, using a variety of operating systems, and different system BIOS configurations to run them. This made buying a computer, particularly for business use, a nightmare of indecision. Computer makers scrambled to get software written for their machines, to get drivers written, and to get operating systems functional.
The cure for this was the first real universal operating system since UNIX. It was written by Digital Research, and was called CP/M (Control Program/Monitor). CP/M, like Apple DOS, was highly influenced by UNIX. Though the commands were different, the operating mode and the concept were the same, though far more limited. CP/M, and the later CP/M-86, looked like they were going to own the small computer market for the foreseeable future (as Windows does today), but as often happens in the IT world, unforeseen circumstances prevailed.
The Giant Awakens
There had been conjecture for several years about “real” computer manufacturers getting involved in the burgeoning microcomputer market. Rumors began to circulate about IBM introducing a small computer, which it did in 1981. Though no one knew quite what to expect, there was a certain amount of disappointment when the machine was unveiled. The new PC (a term IBM popularized) used the 8088, a dumbed-down version of the Intel 8086, a processor which had been around since 1978. It had a maximum usable memory of 640 KB, and generally came with 128 KB on board. The first machines had no hard drive, and were incapable of graphics as we know them today.
IBM was being very conservative in the design of this platform, but they did do one thing right. They made the hardware platform an open design, and invited outsiders to develop for it. This was in contrast to Apple, which was (and continues to be) very protective of its platform. It was obvious that, within a very few years, CP/M, and even Apple, were to be upstaged by the new PC. The rest, as they say, is history, with the PC platform becoming very much a universal computing platform.
Digital Research, the makers of CP/M, were approached by IBM with an offer to design an operating system for the new machines. Unenthusiastic about designing a competitor to their own king of the microcomputer world, Digital Research said no. This has to qualify as one of the great blunders of the computer industry. The project was then offered to a startup company called Microsoft. In part this may have been influenced by Bill Gates’s mother, Mary Gates, who served on the national board of United Way alongside IBM’s chairman, John Opel. Digital Research has since fallen on hard times.
Technical people were underwhelmed by the less than leading edge technology of the new IBM PC. IBM was, during this same time frame, producing small computers, such as the 6000 series, which were far more powerful and much more sophisticated than the new PC. As would become all too obvious, IBM never really took the PC market seriously, and always thought of these machines as entry-level products with which to lure customers over to their real computers. Bill Gates and Microsoft had a different view of things, though they, too, were a bit narrowly focused.
Microsoft needed this new project to put itself into the big leagues, but it had little to offer IBM; its previous work had centered on programming languages, such as the BASIC supplied for the Radio Shack TRS-80 computer. MS-DOS was heavily based on a very simple operating system that Gates bought from Seattle Computer Products in 1981, just before IBM launched its PC. This had initially been named QDOS, which stood for Quick and Dirty Operating System. Gates changed it to MS-DOS (Microsoft Disk Operating System). Early testing and introduction went badly, and IBM had to train Gates and his team in how to develop software and write documentation. Both QDOS and MS-DOS were heavily influenced, by way of CP/M, by (surprise) UNIX.
It became obvious that IBM did not take the PC market very seriously, certainly not as seriously as Gates and Microsoft did. When Bill Gates offered to sell DOS to IBM for $100K, so that he could enlarge his new company, the offer was turned down. In fact, IBM was beginning to look past DOS, and a joint effort was begun in 1985 to create a new 16-bit system called OS/2, which was released in 1987.
The new product was unveiled at a Comdex show. Bill Gates was there, along with a number of members of his team. This was to be a great event for Microsoft, but the IBM rep who was supposed to show up and give a long presentation sent a subordinate, who said a few words and left. The Microsoft people had been snubbed, and made to look rather foolish. The collaboration would last another three years before the companies severed relations, but this was probably the point at which Gates realized that the future of his company did not lie in a junior partnership with IBM.
Microsoft had been working on a graphic interface it called Interface Manager since late 1981. It was loosely based on an improved version of the DOS shell menu system. Version 1 was announced in 1983, but did not begin to ship until 1985, just after the beginning of the OS/2 project, and there was little interest in the product.
In 1988, a similar program called Presentation Manager was released for OS/2. Graphic user interfaces had been around since the Xerox PARC projects of the seventies, but there were now several on the way for the home and office user, from Xerox, Apple, Digital Research, Amiga, and even Tandy. Of all of these systems, only OS/2 and the Amiga were multitasking.
When IBM and Microsoft called it quits in 1990, each company took what it had been working on in the joint project. IBM continued to develop its OS/2 into the Warp series, while Microsoft worked its version into what it began to call NT. IBM allowed OS/2 to wither and lapse into a coma, while Microsoft developed NT into 2000 and XP, and has managed, through its marketing efforts, to dominate the small computer OS market. This is despite the fact that the IBM version of OS/2 was initially the better system. Yet another case of unforeseen circumstances taking out the leading contender.
Windows, the GUI, and a computer on every desktop and in every home
When Microsoft pulled away from its IBM partnership, it did several things which IBM would have done well to emulate.
· The idea of a “user friendly” machine was embraced, and a graphic interface was designed accordingly.
· Development kits were given away free to encourage software creation for the new operating system.
· Microsoft, along with Apple, realized the potential of the home and school market, and catered to it.
· With Windows for Workgroups, networking was integrated into the GUI.
· With Windows 95, protected memory, virtual memory, true multitasking, and OS control of all of the hardware were introduced; NT took these ideas even further. Note that these were all features that UNIX had boasted for two decades, but they were now available for PC class machines, along with a 32-bit architecture (see the sketch of memory protection just after this list).
· The idea of the mini driver, controlling all hardware devices through the OS, was introduced.
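As promised above, here is a minimal sketch of what protected memory buys, written against a POSIX-style environment rather than Windows itself (illustration only): under DOS, the wild write below could land on the interrupt vectors and freeze the whole machine, while a protected-memory OS confines the damage to the offending process.

    #include <signal.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Illustration only: puts()/exit() are not strictly safe inside a
       signal handler, but they serve for a demonstration. */
    static void on_fault(int sig) {
        (void)sig;
        puts("wild write trapped: this process dies, the system survives");
        exit(1);
    }

    int main(void) {
        signal(SIGSEGV, on_fault);
        int *wild = (int *)16;  /* low memory: interrupt vectors on a real-mode PC */
        *wild = 42;             /* a protected-memory OS faults here */
        puts("never reached on a protected-memory system");
        return 0;
    }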
That last list item, the mini driver, is possibly the most important of all, because one of the many problems facing PC users had initially been the variety of printer languages and codes. This was not so bad in the early days, when daisy wheel printers and the primitive (by present standards) dot matrix printers were essentially text-only. As computers came to be used for more graphics-intensive tasks, there were problems getting certain printers to talk to certain programs. Other hardware was also becoming more capable, and thus more demanding. High performance graphics cards by Hercules, and then others, began to appear, as did sound cards and various forms of mass and portable storage.
Before the mini driver, DOS programs each controlled the hardware individually. This meant that a given program had to have drivers written for it so that it could use the printer and graphics card, if special ones were needed. All the OS of the day did was interact with the computer, the data, and the program itself; to access any other hardware, each individual program was on its own. More than any other feature, the mini driver is probably what assured the dominance of Microsoft Windows. It is also what turned Windows into a closed system, since software developers now had to write Windows compatible programs, and hardware manufacturers had to write Windows compatible drivers and produce Windows compatible devices. These programs and drivers would not run on other operating systems.
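A hedged sketch of the contrast (the function names are hypothetical, and real driver code is far more involved): under plain DOS, each application carried its own printer support and talked to the hardware itself; under the driver model, the application writes to a device the OS names, and the installed driver worries about the actual printer’s command codes.

    #include <stdio.h>

    /* DOS era: the application ships its own printer code and writes
       straight to the device. 0x378 was the usual first parallel port;
       the port I/O is left as a comment because it is compiler- and
       machine-specific, which was exactly the problem. */
    void dos_style_print(const char *text) {
        /* outportb(0x378, *text); ...strobe, wait, advance, repeat...
           (every application reimplemented some version of this) */
        (void)text;
    }

    /* OS-mediated: write to the device name the OS provides ("PRN" was
       the DOS/Windows printer device) and let the driver do the rest. */
    void os_style_print(const char *text) {
        FILE *prn = fopen("PRN", "w");
        if (prn) {
            fputs(text, prn);
            fclose(prn);
        }
    }

    int main(void) {
        os_style_print("one driver, many applications\n");
        return 0;
    }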
One of the reasons that IBM did none of these things was that it still did not take small computers seriously. The marketing department fitted them into its product line, and these, the smallest and cheapest of the IBM machines, were put at the bottom of the list. Microsoft had an entirely different perception, and strove to increase the capacity of the Windows operating system as much as possible.
With the exception of user friendliness and the graphic interface, none of the features offered by Windows had been at all unusual on full sized computers. The reason they were now available on smaller, PC class machines has to do with that favorite of the hardware hacker, Moore’s Law: hardware kept getting cheaper, faster, and more capacious.
No early programmer would have palmed off the type of bloatware being written today. The reminiscences of old programmers almost sound like the stories that people who grew up in the depression era used to tell their kids and grandkids (“When I was a programmer, we only had 2K of core memory, and we were GLAD to get it!”). As part of its user-friendly ideal, Windows was full of bells, whistles, and fluff. It was also getting quite flaky and unstable. In part this was due to the requirement to interact with software written by third party developers and hardware manufacturers, but it was also partly due to the unwieldy nature of the system and the size to which it had grown.
Microsoft got sloppy because it could afford to: the hardware kept getting faster and cheaper, and there was no real competition to keep it honest.
The Computer as a home appliance
The huge market created by Microsoft and the PC clone manufacturers brought the computer into nearly every office and, over time, nearly every home. This market catered to a different kind of customer than the old technically savvy computer professionals. These new “users” often did not know, or even care, about the workings of their computers. In some cases there was a computer phobia to be dealt with. For most of these people a computer was, like a car, a tool with which to get a job done. The hacker, like his automobile counterpart the motor head, tended to be younger, and in the minority. Most people liked Windows because its point-and-click visual interface was easy to operate, and non-threatening to the new user.
The open nature of the PC architecture, along with the successful reverse engineering of the system BIOS by outfits like Phoenix, took most of the PC market away from IBM (which was destroying its market share by marketing to individuals in the same way it had marketed to corporations) and split it wide open. The new market was to be dominated by companies like Compaq, HP, Dell, and later on Gateway, along with thousands (maybe millions) of little companies producing generic “white box” computers. One thing all of these machines had in common was their dependence on Microsoft operating systems.
In many ways, these new computers became victims of their own success. As the prices dropped, support grew worse, and the specialty computer store became a thing of the past. To make matters worse, inexpert (or unethical) sales clerks oversold these machines, claiming an ease of use and dependability that we still don’t have today. The computer was presented as a sort of home appliance, which was just plugged in, turned on, and operated. The Internet was represented as a sort of interactive version of television.
Microsoft grew to its present size not so much because it did everything right, but because it visualized a market, and did fewer things wrong. It won its place by default, benefiting from terrible feats of misjudgment by IBM, Apple, and Digital Research, to name just a few. As a reward for its good behavior, Microsoft now owns the small computer OS and software markets. When it speaks, developers, manufacturers, and competitors listen.
While all of the “users” were flocking to Microsoft in droves, the more technically astute were sticking with UNIX, or one of the more esoteric systems that seem to be popping up all of the time. It is also a fact that before NT4, Microsoft had little or no presence at all in the server or large computer market. Most serious network administrators, particularly in larger environments, use UNIX servers for their most critical tasks. In many cases, NT or 2000 servers are used only to support the use of Microsoft domains on the workstations used throughout the network. Even here, though, some administrators prefer the Novell context/server, or UNIX permissions and groups, to the need for adapting the Microsoft schema to their network.
Though vastly improved, Microsoft servers, and Windows products in general, still have nagging security flaws. In large part, this is due to the Microsoft philosophy of making setup and use as easy as possible, catering to the needs of the user class. This would be fine, except that it leaves most Microsoft operating systems wide open to attack and infiltration.
Many network administrators figure it is a good idea to restart their Windows server every weekend, “just in case”, if it hasn’t needed restarting earlier in the week. In contrast, there are Netware and UNIX systems which have been up for months and even years at a time. The machines are getting more robust, but are still easily crashed compared to UNIX, Novell, and the latest version of the Macintosh. Microsoft has tightened things up a bit, but there are still far too many parts of the driver and hardware design process that are out of its control. This is yet another problem caused by the emphasis on user priorities, and the attempt to be all things to all people.
For the home user, Microsoft licensing might seem pretty straightforward, until you upgrade your computer, crash your hard drive, or get a constant kernel error which requires you to reinstall the operating system. Any of these scenarios can involve the home user in a battle with the new Microsoft activation system. Though irritating, these problems are generally solved by calls to Microsoft support, provided that you have kept all of the paperwork that came with the OS.
For businesses, the situation is a bit more complex. The ever-changing Microsoft licensing structure is getting so complex that many larger companies have licensing experts on staff to handle issues of license compliance. Microsoft has special licensing experts who, for a fee, recommend the most advantageous licensing option for a company, and check for compliance. Microsoft licenses by machine, site, connection, installation, or some combination of these. It also offers subscription services, ensuring long-term dependence on its products by corporate users. These tactics, the money they cost, and the work they create are causing many businesses to consider alternative operating systems. It is also a slap in the face to be charged a fee for each operating system installation, and then another fee for each seat or concurrent connection. No one else does this. Novell charges by the connection, and allows as many server installations as are desired. UNIX charges by the installation, and levies no connection charge. Linux is, of course, free.
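The arithmetic behind that complaint is easy to sketch. The prices below are purely hypothetical placeholders (real price lists varied by year and by agreement); the point is the shape of the two models, not the exact figures.

    #include <stdio.h>

    /* Hypothetical numbers, for illustration only: pay per server AND
       per seat, versus pay per connection with unlimited server installs. */
    int main(void) {
        int servers = 4, clients = 200;

        double per_server = 800.0;  /* assumed per-installation fee     */
        double per_seat   = 30.0;   /* assumed per-seat client license  */
        double per_conn   = 40.0;   /* assumed per-connection fee       */

        double double_charge = servers * per_server + clients * per_seat;
        double conn_only     = clients * per_conn;  /* servers free to install */

        printf("per-installation + per-seat model: $%.0f\n", double_charge);
        printf("per-connection model:              $%.0f\n", conn_only);
        return 0;
    }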
UNIX and Novell can support hundreds of machines per server, provided the hardware is up to the task. Windows, on the other hand, is rated by Microsoft at 12 to 20 machines per server, and there are some who argue that even this is optimistic, particularly in busy environments. In a less costly operating system this might be acceptable, but Windows is as expensive as any to buy, and probably the most expensive of the top four (UNIX, Netware, Linux, Windows) to run. This is particularly true if downtime is taken into consideration, something Novell constantly pummels Windows over. Of the major operating systems, Windows has the highest total cost of ownership.
In the early nineties, a good computer set you back $2000 to $3000. Today a good computer will cost you $600 to $1200. In the early nineties a Windows operating system sold for $50 to $75. Today, Windows 2000 will cost you $300. I think that nothing more really needs to be said on this subject.
This section was not meant to pick on Windows, but to illustrate that the system does have its problems, and that Microsoft seems to be growing complacent in everything except marketing. It seems that Microsoft is now making all of the mistakes it didn’t make decades ago. Still, Microsoft is the desktop of choice on an estimated 88% of machines (2001 numbers). Macintosh makes up another 8% of the desktop and home market, while UNIX, Linux, and all the rest constitute the remaining 4%. Though there may seem to be little for Microsoft to worry about, the company sees its position threatened.
Moore’s Law, and UNIX
Microsoft was the first beneficiary of Moore’s Law, counting on ever faster, ever cheaper hardware to carry its growing operating systems.
Windows had always been at a slight disadvantage in comparison to full sized computer operating systems, because it had taken the opposite approach. That is to say, Windows started small, as a development (though this was always denied by Microsoft) of 8-bit DOS, and later of the hybrid 8/16-bit Windows 3 series. Microsoft attempted to build these up into a full powered 32-bit operating system in Windows 95, by a process of accretion. This is not the best way to build an operating system, and explains in part the size and instability of some of the Windows systems. Microsoft has also found it a chore to maintain compatibility with its older operating systems while retaining stability. It may also be that, in integrating its GUI so tightly with the OS, Microsoft has doomed Windows to be forever unstable.
Even Microsoft’s newest series of operating systems (NT, 2000, XP) are somewhat hamstrung, to the extent that they are all outgrowths of the late-80s collaboration with IBM, and were designed as small-scale imitations of UNIX. Today’s computer user has memory and hard drive options that would have been unheard of in a mainframe of a decade ago. Thus UNIX is now as viable an option for the PC user as it has always been for the mainframe and server operator. You no longer have to settle for a scaled down OS inspired by UNIX. Why, though, would anyone wish to make the switch?
Why UNIX?
Multiplatform: it can be ported to nearly any hardware.
Powerful features: the shell can be used almost like a programming language.
Flexible user interface (UNIX can run multiple user environments and shells).
Multiple concurrent users are supported.
X Window features allow for remote login, and for different operating environments.
Quite secure
Quite stable
Integrated networking
Common, and well known in the industry
There are variants, but all can be easily mastered, just as a driver can learn the various quirks of different types of cars.
In contrast to MULTICS, UNIX was widely licensed to many universities, and was even free at one time; thus an entire generation of computer professionals was exposed to it on a variety of machines.
Native 32 (or 64) bit OS
Tight, efficient code: it runs faster, and on fewer resources, than the latest versions of Windows.
Highly scalable, and lends itself to integration of multiple computers.
When DARPA built ARPANET, the predecessor of the Internet, UNIX servers became its backbone.
On the minus side:
Windows dominance of the PC market has made UNIX an unfamiliar OS for most users.
Windows dominance of the PC market has made UNIX user applications scarce.
Some versions of UNIX are still very expensive, though with Windows now selling for $300, certain UNIX variants selling for under $100, and Solaris being offered for free, this is no longer really an issue.
A bit more difficult to use, particularly for set up and configuration. You have to know what you are doing.
It occurred to Linus Torvalds, then a computer science student at the University of Helsinki, that the PC hardware of the day was now powerful enough to run a UNIX-like system of its own. In 1991, about the time Windows 3.1 was being readied, Torvalds quietly posted the kernel we know today as Linux, developed under (and inspired by) the teaching system MINIX. Red Hat Linux, today’s most popular version, and the easiest to install (though I will get arguments on that from some), was released in 1994, just after Windows for Workgroups shipped. Little note was taken of these events at first, except by the same types of people who had started the original microcomputer revolution in the first place. These hobbyists and hackers gave the new system a try, and were pleasantly surprised. Though some faults were found, an amazing thing happened: users were able to correct these faults and recompile the operating system themselves. They could also rewrite sections of it and add features or functionality. This was unheard of in any other operating system.
Few critical machines have ever run any version of Windows, and nearly all of the world’s most powerful computers have run some variant of UNIX. It is, and has been for decades, the most pervasive operating system on larger computers and networked systems. Now, in its latest incarnation, Linux, it is taking on the desktop dominance of Microsoft.
Sun, IBM, and Novell all have Linux distributions out there for their servers. Compaq, Dell, Gateway, and IBM, along with some others, all offer preinstalled Linux (generally Red Hat) on their retail computer systems. Wal-Mart stores are now offering a machine preloaded with Linux and OpenOffice. It seems that a number of companies are jumping on the bandwagon to offer Linux to the consumer. Corel had a version out for a time, and a number of other Linux distributions are offered by a variety of vendors.
Linux offers all of the advantages of UNIX plus the following:
Open source (Public source code).
Not owned or controlled by anyone
Copylefted (free).
Constant improvements, and peer review.
A number of graphic interfaces are now available, making Linux nearly as user friendly as Windows.
Why bother to change at all?
Information, and the means of distributing it, want to be free; impeding this natural tendency comes only at great cost, and is doomed to failure.
Windows is like a nice new passenger car. Fortunes are spent on designing the interior, selecting fabrics, mixing colors for paints, designing a stereo and good air conditioning. Little is spent, by comparison, on the mechanical systems. UNIX is a Mack truck: kind of ugly, not very easy to drive until you are trained to it, and maybe not as comfortable inside as the newest passenger cars. The Mack truck is far more capable, longer lasting, better built, and more versatile, but the passenger car is pretty, and easy to drive. So how many people drive Mack trucks, and how many drive passenger cars? Linux, and the new GUI-enhanced versions of UNIX, are offering us nicely appointed Mack trucks for the same cost (or less) as a regular passenger car.
Linux proponents claim a user base of 11 million, which seems a bit optimistic to me. Still, these types of numbers are probably not too far in the future.
Microsoft is not a bad company, and Bill Gates is not an evil man. The computer community owes the company a great debt for bringing the computer out into the mainstream. They had a vision, and profited because of their insight. It is unfortunate that they were alone in their vision, and so earned themselves a nearly uncontested market. Domination of any market or field of endeavor by a single player is unhealthy, for the industry and for the public at large. It is guaranteed to stifle innovation, slow progress, and leave the world with a single vision, rather than a diversity of views. This is what the return to UNIX/Linux will, with luck, save us from.