Marty Cawthon presented a talk at the CAMP Conference on May 6 in Rosemont, Illinois (near Chicago O'Hare Airport).
You know how to spell 'unix' and that's about it,
but you would really like to know:
Marty Cawthon is the president and cofounder of ChipChat, a provider of Internet, software, and solution services for companies and organizations. ChipChat has facilities in Dearborn, Michigan; Mentor, Ohio; and Koga, Fukuoka, Japan.
You may contact ChipChat by telephone at 313-565-4000 or by email to Marty Cawthon, and ask if ChipChat can be of help on your next Internet-related project.
Some comments from the attendees who heard the talk at the CAMP Conference:
Marty's talk - very helpful.
Review of history was not new for me, but the anecdotal examples and arguments in favor of higher quality of open-source software will be very useful in upcoming discussion in my office.
Excellent presentation, very informative. Appreciate the background historical information. Also appreciate the analogies which help non-technical people understand the point, and make the point more memorable to the techies.
Marty Cawthon as usual did his typical fine job. Good informative session. His enthusiasm for Unix certainly came thru.
Marty was very good. I'm not a Unix person, but enjoyed his presentation & didn't feel left out. Good basic information.
The actual talk was given to "rave reviews" (see above). But those of you who missed it need not be despondent as this talk lives on, with enhancements, here at this web site. I hope you find these comments stimulating and interesting. I appreciate your feedback (see the bottom of this page).
I am building this on-line version as time permits. It is not yet finished.
It was just a few years ago that Microsoft executives and the computer trade press were all gushing about "how awesome Windows NT was going to be".
Windows NT was going to kill off IBM's OS/2, Novell Netware, and all those pesky Unix machines with their 'difficult-to-use' text configuration files. Next would be the IBM AS/400 machines and of course, eventually the mainframes.
Stewart Alsop, then Executive Editor of Infoworld, went so far as to predict the exact date when the last mainframe would be unplugged: sometime in 1994, as I recall.
Here we are in May 1999 and interest in Unix is higher than it has ever been. Incredibly, books about "unix", "emacs", and "perl" are selling well.
What has happened?
I believe that the increased interest in Unix-like operating systems is driven by three factors:
The Internet is the biggest thing in the past fifty years.
Is there any company of significance that does not yet have a web page? I don't think so.
IBM gushes about their "e-commerce" while Hewlett Packard talks up "e-services". School systems, businesses, non-profit organizations, zoos, and restaurants are all scrambling to "get on the 'Net".
Even Vice President / Presidential Candidate Al Gore has jumped in with enthusiastic vigor and claims to have "taken the initiative in creating the Internet". In a truly incredible feat, he seems to have created it years after people were already using it. I sent our tech-savvy VP an email asking if he could elaborate on his 'Internet initiatives'. Alas, I received only a canned reply - thanking me for my support. "Creating the Internet" is one thing, taking the initiative to answer a simple question is another...
Anyway, you know it's a big deal if a presidential candidate tries to take credit for it.
For those of you who remember, the PC revolution, as wild as it was, was nothing like this. The "Internet Revolution" is bigger than the "PC Revolution".
So what does the Internet have to do with the resurging popularity of Unix? As it turns out, you can understand the present situation much better if you learn about the past. The Internet has its roots in the late 1960s and 1970s and has evolved ever since. The evolution of the Internet parallels the growth of Unix. In fact, the first implementation of the TCP/IP protocol was in the BSD (Berkeley Software Distribution) version of Unix.
Since then there have been hundreds of thousands of programmer hours, perhaps millions, invested in TCP/IP and related utilities to run in a Unix environment.
Computer systems and the 'native' networking protocols that evolved with them:
| Computer / Operating System | Network Protocol |
|---|---|
| IBM Mainframes | SNA |
| Unix & Unix-like | TCP/IP (same as the Internet) |
| Novell Netware | IPX/SPX |
| Microsoft Windows | NetBEUI |
| Apple Macintosh | AppleTalk |
The Internet requires servers: email servers, ftp servers, web servers, etc. The most natural operating system for this duty turns out to be the same platform on which the Internet evolved: Unix.
The Internet is reason number 1 for the renewed interest in Unix.
Reference: All About the Internet: History of the Internet
Sometimes your worst enemy is your best friend.
This seems to be the case with Unix. The marketing hype/propaganda from Microsoft and others certainly led many folks to believe that Unix was doomed and in short order we would all be "pointing and clicking" our way to glory. But the reality is something quite different.
I doubt if there is a computer-savvy person on the planet who is not familiar with the fragility and frustrating experience called "Microsoft Windows". Excepting, perhaps, tech-savvy presidential candidates. I won't go on and on with "Windows Horror Stories". Rather I will present my thoughts for "why Microsoft Windows is less reliable, less robust, and less stable than Unix-like operating systems".
My thoughts distill down to these:
Each of these will be discussed in detail.
"Unix" was a personal research project started in 1969 by Ken Thompson at Bell Laboratories.
AT&T, General Electric, and MIT had formed a consortium in 1965 to design and build a great new operating system called "Multics". In 1969 AT&T decided to withdraw from the consortium. Ken Thompson, it seems, was still interested in many of the design goals of Multics, but wanted to pursue them on a smaller scale. He coined the term "Unix" as a word-play on the much larger "Multics" project.
In 1969 AT&T was the regulated telephone monopoly. As such it was forbidden from doing any business other than telephone service and equipment. Perhaps this was a very good thing for Unix, as it allowed this new system to develop without committees of marketing managers getting involved. For 15 years, from 1969 until 1984, AT&T was forbidden from marketing operating system software. During these formative years Unix developed under the guidance of computer scientists and engineers. Their main concern was to develop new technologies into a reliable, serviceable operating system.
In 1984 AT&T was divested of its telephone monopoly, and AT&T was allowed to sell computer hardware and software. But from the mid 1970s until the mid 1990s Unix played an important role in the computer science and engineering studies at many colleges and universities, especially at the University of California at Berkeley.
The Microsoft message is wearing a bit thin, but it wasn't too long ago that Microsoft claimed it would have Windows running on small computers, big computers, pocket computers, toasters, wristwatches, Lamborghinis, and vending machines. Windows NT was an important part of that message.
Even before the product was completed Microsoft and gullible journalists would announce "yet another platform that will soon be demolished by Windows". Common folks began to believe this because Microsoft said that Windows NT would do everything that fill-in-the-platform-to-be-demolished did, but better, faster, and cheaper. Of course it was possible that Microsoft was stretching the truth, but nobody wants to call the world's richest man a liar, lest he leave you out of his will. So it was convenient for most people to believe Bill Gates and Microsoft.
Microsoft's marketing may have convinced many people to switch to Windows NT, or at least to look at it. But no amount of marketing can actually make something work. At some point the product must perform.
Microsoft Windows does work - it just doesn't work as well as the marketing hype/propaganda told us it would work. So many computer professionals are taking another look at Unix.
One of the complaints about Windows NT 3.51 was that it was too slow.
The Microsoft approach to fixing this problem was to move the windowing and graphics code from 'user-land' into 'kernel-land'.
An explanation: Many modern operating systems take advantage of the hardware CPU to operate in at least two different modes. These are often referred to as kernel mode and user mode or "kernel-land" and "user-land".
Code that operates in kernel-land has full access to all memory and I/O ports. This code is not restricted.
Code that operates in user-land is limited to accessing only the virtual memory assigned to it by the kernel. This code is restricted by certain rules that are set up by the kernel and enforced by the CPU hardware.
You can guess that you put the core (or kernel) of your operating system in kernel-land. Make sure this kernel code is well tested because a problem in the kernel will likely halt the entire machine.
User-land code, like all those shrink-wrapped software packages, can be naughty or nice. It doesn't have to be well behaved. If user-land code tries to do something it should not, then the CPU will enforce the restrictions set up by the kernel and terminate the process (the user-land code).
This is sometimes called Crash protection or protected mode operation.
Back to Microsoft: They moved the graphics/windowing code from 'user-land' to 'kernel-land'. They did this because when the CPU switches between "kernel-land" and "user-land" (technically, on Intel CPUs, between "Ring-0" and "Ring-3") it takes some time. If code switches back and forth many times, those delays add up. Move "Ring-3" code into "Ring-0" and there is less switching. The end-user sees the screen paint faster and says "Windows 4.0 is faster!".
However, the more code in "kernel-land" (Ring-0) the "thinner the ice on which you skate". With 3.51 a fault in the graphics or windowing part of Windows NT would be caught by the CPU, and recovery was possible without affecting running programs. However with 4.0 (and Windows 2000) this same fault will occur in kernel-land, not user-land, and will likely cause the system to fail. This is known as the Blue Screen of Death.
I present this as evidence that at Microsoft "Marketing dominates over Engineering". Moving the graphics/windowing code into the kernel is not a good engineering design. It is a "quick fix" to make users think the update is running faster.
By contrast, GNU/Linux, BSD, and Unix systems run the graphics and windowing code in "user-land". So if your GNU/Linux machine freezes its graphics display, you can just telnet in from another machine and kill the X-Windows process; your machine continues to run, including all running daemons (server processes), without a hitch. (Note that I may have seen X-Windows lock up once, ... maybe).
The design philosophy of Unix and Unix-like systems is different from that of Microsoft Windows.
The unix design is summarized as "spartan". (Remember that the term 'unix' originated as a word-play on 'multics', which was big, heavy, and complex.)
Unix-like systems consist of a kernel plus many utility programs. Each utility program was designed to do just one thing but to do it very well. These programs can communicate with each other through something called inter-process communication. Computer programmers refer to a program as a set of instructions to be executed, while a process is a program that has been loaded into memory and is executing.
By combining multiple do-one-thing-and-do-it-well processes a programmer can create a more complex process. Often this is done by programming in a shell language.
These utility programs that comprise unix-like systems have been used and refined over the past 30 years and are generally pretty reliable and stable. So, if a programmer uses them in a shell program to accomplish some more complex task, it is very likely that these utility programs will function well. Of course, the shell script might have errors in it, but the failure is rarely in the pieces that make it up.
I think of these utility programs as analogous to atoms, and of shell scripts as analogous to molecules. There are only about 100 elements (with variations of each), yet there seems to be an unlimited variety of molecules that can be made from them.
You might think that this "do one thing and do it well" approach to building a complex system is a good one. I think it is - from an engineering standpoint.
So did Microsoft follow the same philosophy when they designed Microsoft Windows?
It appears not. Previously I said that Microsoft's motivating force for their Windows was dominated by Marketing, not by Engineering. Here is further evidence.
Rather than build a system similar to Unix, where various pieces can be removed and replaced, Microsoft has built a system with many artificial inter-dependencies. Apparently this is done to further herd the marketplace into adopting more Microsoft products.
Microsoft provides excellent evidence to back me up:
Look at the highly publicized case of the "US Department of Justice versus Microsoft". Much of the testimony from the Microsoft witnesses concerns how Microsoft Internet Explorer (MSIE) is an integral part of the Microsoft Windows Operating System. They present proof in the form that "if you remove MSIE from Windows you are left with an unstable and unreliable system".
The US-DOJ argues that MSIE is an application which is illegally bundled with Microsoft Windows. They present proof in the form that "you can remove MSIE from Windows in 90 seconds or less".
Arguments fly back and forth, and ultimately a Judge will rule.
However you can bet that the next version of Windows (Windows-2000) will have MSIE more tightly interwoven.
By contrast, on Unix-like systems it is easy to add, remove, or replace web browsers, or even have different web browsers co-exist on the same system - even running simultaneously.
When Microsoft builds artificial inter-dependencies into their Windows they make it less stable and less reliable.
Microsoft doesn't stop with MSIE. They are working to provide even more artificial inter-dependencies in Microsoft Windows-2000. This will likely make Windows-2000 even less stable and less reliable, and drive more people to consider Unix, GNU/Linux, and BSD systems.
Imagine that you have two identical computers with identical operating systems on them. Each is used by a different person who is an expert at using those systems. After several days/weeks of use, the machines are configured differently - to suit the needs and preferences of each user.
Question: What makes the two systems different?
Answer: Configuration files.
Unix configuration files are all in plain text format. This means that they are a series of printable characters with each line terminated by a "new-line" character. The advantages of using plain text configuration files:
Some things that administrators find frustrating about unix-like configuration files:
When your computer gets "out of sorts" it's generally no fun. With unix-like operating systems you usually must understand what is wrong, know which configuration file is responsible, understand the internals of that configuration file, edit that file, and restart the process that depends on that file.
This requires:
Microsoft Windows saves all of its configuration information in binary files (they cannot be edited with a text editor, such as 'notepad'). Further, the format of these files is proprietary and subject to change with each new version (and sometimes service-pack) of Windows. This makes it very difficult for any third party developer to build a tool to help you edit this file reliably.
The proprietary binary format of the registry means that it is impossible for most computer users, administrators, programmers, and professionals to detect and repair corrupted registry files. The best you can do is re-install the operating system and all your applications.
If you run Microsoft Windows you get to choose from the following windowing systems:
Almost no software relies upon it.
If you run Microsoft Windows you get to choose from the following command shells:
Microsoft often points to the many varieties of unix and shouts "Bad Thing!". Well, now that we've all been burned by their Microsoft Windows, let's pause to actually think: are the different varieties of unix a bad thing?
Maybe not.
The different varieties represent different groups - some commercial and some not - working to further refine and develop unix-like operating systems.
BSD is the Berkeley Software Distribution version of Unix.
The logo for BSD is the BSD Daemon.
The BSD version of Unix is quite extensive. The FreeBSD operating system includes a text file called the bsd-family-tree which provides an overview of the derivatives of BSD.
The Berkeley License is an important software license. NetBSD, FreeBSD, and OpenBSD are licensed under the Berkeley license. This license is a neutral license, as it allows anybody to use the source code in any manner they wish, provided that they include an appropriate copyright notice on further distributions.
This type of license is favored by groups who want to promote a particular standard.
An example: the Apache Web Server is licensed under the Berkeley license. This allows software companies to use the code in proprietary commercial products. Any such products will help further promote the use of standard HTTP, because that is the HTTP implemented in Apache.
If you are a big greedy software company who wants to make the Internet run on your own proprietary protocols, then the Berkeley license is not for you!
NetBSD was the first free software "spin-off" from Berkeley BSD.
Reference: http://www.netBSD.org
Reference: http://www.FreeBSD.org/
Reference: http://www.FreeBSDmall.com/support/
A spin-off from NetBSD, OpenBSD emphasizes security. Code development takes place outside the United States in order to avoid trouble with US exports on encryption. The project is based in Canada.
This is an example of how the design of an operating system affects its security.
The OpenBSD project builds strong encryption into the kernel and also into other utilities. This encryption is stronger than what is legal to export from the United States. Fortunately for Americans there is no law restricting the import of strong encryption. OpenBSD is legal and available to US residents.
This is an example of how the implementation of an operating system affects its security.
Some of the vulnerabilities of operating systems on the net are due to buffer over-run conditions. The source code might specify a maximum buffer size of, say, 100 characters for a name. A clever but malicious person might push 110 characters as a name, with the final 10 characters carrying some special binary meaning. The executing code overflows the buffer, valid data or instructions are over-written by the extra 10 bytes, and the system may become compromised.
To reduce these vulnerabilities, the OpenBSD project has worked to audit critical parts of their code. When these vulnerabilities are found, they are fixed.
Reference: http://www.OpenBSD.org
The GNU project uses a "gnu" as its logo.
GNU means "GNU is Not Unix", and is a recursive acronym. If you ask "What does the GNU mean?" you will get the reply "GNU is Not Unix"
Yes, I know it is not Unix, but what does the "GNU" mean?
"GNU is Not Unix"
That is what is meant by a "recursive acronym".
The story of GNU is very interesting. I recommend that you spend some time reading the pages at http://www.gnu.org to get the story right from the horse's mouth.
This short description is lifted from the META description of the GNU homepage:
Since 1983, developing the free unix-like operating system, GNU so that computer users can have the freedom to share and improve the software they use.
GNU code is of very high quality. In addition to that code, which was created by many talented volunteers, Richard Stallman, the founder of the GNU project, created the GNU General Public License (GPL). This is a very important license because it provides the software user with the right to always have access to GPL software source code.
The GPL requires that once software is under the GPL, all subsequently distributed versions must also be under the GPL. This license encourages software development from many sources, and discourages companies from making proprietary enhancements to it. If you are a big greedy software company who wants people to pay upgrade fees for life, then the GPL is not for you!
The Linux community uses a 'stuffed penguin' as their logo.
The distribution of GNU/Linux is different from that of the BSDs. While there is only one distribution of FreeBSD 3.1-Release, there are usually several distributions of any particular version of GNU/Linux.
Redhat is a commercial enterprise which has been successful in capturing the attention of reporters, editors, and newcomers to GNU/Linux.
Reference: http://www.RedHat.com
Here is a description of the GNU/HURD which I lifted from the GNU Hurd Information web page.
The GNU Hurd is the GNU project's replacement for the architecture-independent services provided by the Unix kernel. The Hurd is a collection of servers that run on top of a microkernel (such as Mach) to implement file systems, network protocols, file access control, and other features.
Check the GNU/Hurd Information web page (link above) for the latest available code and comments on limitations and bugs. If you have the time and inclination this might be fun to try out. (A lot more fun than struggling with Windows-2000 beta 3)
If successful, the GNU/HURD will be a new operating system that is a real alternative to unix-like kernels such as Linux and the BSD kernel. I think this will be very exciting!
Using the Internet, innovative people have created automatic surveys to cut through marketing hype and determine exactly who is using what.
Take a look at how the busiest site on the Internet does their web serving, and compare that to how Microsoft does it.... Very Enlightening.
You don't actually 'buy' software - you 'license' it. Your license agreement controls how you can use your software, for example
What is free software? What is open software? Is it a radical idea? Is it good to use? Links to some well written thoughts...
Join the discussion about this topic. Ask Questions! Make Comments!
You may include HTML markup tags in your message.
001  1999-03-31 16:35 EST  Marty Cawthon (mrc@chipchat.com)  Basis of the talk
The basis of my talk will be my direct hands-on experience with GNU/Linux, Unix, BSD, and other systems.
There is a wealth of other information on the Internet regarding other people's experiences. I have put some links on this web page and will continue to do so.
If you leave questions or comments here, you will help me deliver a talk that is more useful to you.