Windows was designed according to the “one computer, one desk, one user” vision of Microsoft’s
cofounder Bill Gates. For the sake of discussion, I’ll call this philosophy single-user. In this
arrangement, two people cannot work in parallel running (for example) Microsoft Word on the
same machine at the same time. Using Terminal Services in Windows 2000 or Windows XP
allows remote use of one computer from another but is still bound by the single-user paradigm.
The Windows .NET Server products, which are still in development as of this writing, continue to add
terminal features to enable more than one user to access the server simultaneously.
Linux borrows its philosophy from UNIX. When UNIX was originally developed at Bell
Labs in the late 1960s and early 1970s, it ran on a PDP-7 computer that needed to be shared by an entire
department. It required a design that allowed multiple users to log in to the central machine at
the same time. Various people could edit documents, compile programs, and do other work
at the exact same time. The operating system on the central machine took care of the “sharing”
details, so that each user seemed to have an individual system. This multiuser tradition
continues today on other UNIX variants as well. And since Linux’s birth in the early 1990s,
it has supported the multiuser arrangement.
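As a concrete (and entirely invented) illustration, the who command on a shared Linux machine lists everyone currently logged in; the usernames, terminals, and hostnames below are made up for the example:

    $ who
    alice    pts/0    Mar 14 09:12    (alice-desktop)
    bob      pts/1    Mar 14 09:30    (bob-desktop)
    carol    pts/2    Mar 14 10:05    (lab-terminal)

Each of those sessions can be editing files, compiling programs, or running jobs of its own, while the kernel quietly shares the processor and memory among them.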
Today, the most common implementation of a multiuser setup is to support servers—
systems dedicated to running large programs for use by many clients. Each member of a
department can have a smaller workstation on the desktop, with enough power for day-to-day
work. When they need to do something requiring significantly more CPU power or memory,
they can run the operation on the server.
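One common way to do this on Linux is simply to log in to the server over the network and start the job there. The sketch below assumes a hypothetical server named bigbox, an ssh login, and a home directory shared between workstation and server (over NFS, for instance); all of the names are illustrative:

    # From the workstation: run a long, memory-hungry build on the server,
    # leaving the local desktop free for day-to-day work.
    $ ssh bigbox 'cd ~/project && make -j4'
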
Linux, Windows 2000, and Windows .NET Server are all capable of providing services
such as databases over the network. Users of this arrangement can be called network users,
since they are never actually logged in to the server but rather send requests to it. The server does the work and then sends the results back to the user over the network. The catch is that an application must be specifically written to perform such client/server duties. Under Linux, a user can run any program the system administrator allows on the server, without that program having to be redesigned for the purpose. Most users find the ability to run arbitrary programs on other machines to be a significant benefit.
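The difference is easiest to see with a database. A network user never logs in to the server; a purpose-built client simply sends a query and receives the results. The example below uses the standard mysql command-line client against a hypothetical server named dbserver (the hostname, account name, and table are invented):

    # A network user: the query travels to the server, the server does the
    # work, and only the results come back. No login session is created.
    $ mysql -h dbserver -u alice -p -e 'SELECT COUNT(*) FROM orders'

By contrast, the ssh session shown earlier is a genuine login on the server, so any program the administrator permits can be run there, whether or not it was ever written with client/server operation in mind.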