A Brief History Of Linux
Long long ago (around fifty years, to be precise)...but not in a galaxy far far away (right here on Earth, in fact), there was an effort to create an operating system called Multics – the Multiplexed Information and Computing Service. You see, until then there were no 'operating systems', as we refer to them today, that could be carried from one machine to another; each and every hardware manufacturer wrote its own execution code, specific to its own piece of hardware. Then, in the mid-1960s, the Massachusetts Institute of Technology (MIT), in association with Bell Labs, tried a new approach with Multics. Cross-hardware compatibility wasn't the only thing they were aiming at; Multics was also to incorporate time-sharing – allocating specific slots of CPU time to multiple programs that needed to be processed – in contrast to what the hardware manufacturers' code generally did: batch processing, that is, running a single task to completion before starting the next.
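The difference between the two approaches can be sketched in a few lines of Python (the job names and CPU-time figures here are, of course, made up for illustration):

```python
from collections import deque

# Three hypothetical jobs: [name, units of CPU time still needed].
jobs = deque([["A", 3], ["B", 2], ["C", 4]])
QUANTUM = 1  # each job gets one unit of CPU time per turn

schedule = []
while jobs:
    job = jobs.popleft()
    job[1] -= QUANTUM          # run the job for one time slice
    schedule.append(job[0])
    if job[1] > 0:
        jobs.append(job)       # not finished: back to the end of the queue

# Time-sharing interleaves the jobs, so all of them make steady progress:
print("".join(schedule))       # ABCABCACC
# Batch processing would instead run each to completion: AAABBCCCC
```

With time-sharing, job B is done after five time units instead of waiting for A to finish first – which is exactly why interactive use of one machine by several people became practical.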
Soon, however, the cost of the Multics project started spiralling, with implementation troubles and mounting delays – so eventually, Bell Labs pulled out of the project in 1969. Some people at Bell Labs wanted to keep at least part of the effort alive; they took what had been done and built an operating system called Unix – a play on the name 'Multics'. Unix became quite popular and was widely used at the time – even running the very first servers of the nascent Internet.
Fast forward to the 1980s, when the Richard Stallman-founded Free Software Foundation decided to start the GNU project (a recursive acronym for 'GNU's Not Unix') to create a free-software Unix-like system. Work progressed well, with the GNU project creating many programs that went on to be widely used by other developers – like the GNU C Compiler (GCC) and the Emacs text editor. Another major development of this period was the GNU General Public License – a free software license giving users the freedom to view, modify and distribute program source code – under which many free/open source programs are released to this day.
The GNU project was very successful, and almost succeeded in creating a full operating system – 'almost' being the keyword here. You see, GNU hadn't had much luck with GNU Hurd (a take on 'a herd of gnus' – gnus, or wildebeest, are antelopes found in Africa). GNU Hurd was to be the kernel of their operating system, but its development never really took off. It didn't attract much developer enthusiasm, and to this day isn't mature enough to be used in real computing environments. What's a 'kernel'? A kernel is one of the lowest levels of an operating system – the code that 'speaks' to the hardware, schedules processing tasks and manages hardware drivers (the programs that make your hardware work) – basically, it takes the requests that users make through the programs they run, and carries them out on the hardware.
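As a tiny illustration (in Python here, though the mechanism is language-independent), even printing a line of text goes through the kernel:

```python
import os

# A program does not drive the terminal hardware itself; it asks the kernel
# to do so via a system call. On Linux, os.write() maps to the write()
# system call: the kernel takes the bytes, schedules the I/O, and talks to
# the actual device driver on the program's behalf.
n = os.write(1, b"hello via the kernel\n")  # fd 1 is standard output

# The kernel reports back how many bytes it accepted.
print(n)  # 21
```

Every program on the system – an editor, a compiler, a web browser – ultimately reaches the hardware through calls like this, which is why an operating system is unusable without a working kernel.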
Around 1990, there was a student at the University of Helsinki (Finland) called Linus Torvalds. His university used a Unix-like system called Minix (developed by Andrew S Tanenbaum); however, Minix's licensing terms were restrictive, so Torvalds set about writing his own kernel – using tools like GCC, created by the GNU project – so that he could have a Minix-like system running on his home computer. In 1991, Torvalds finally released this kernel, called Linux. It was initially to be named Freax – a play on the words 'free' and 'Unix' – but he was ultimately persuaded against it.
The Linux kernel became quite popular, and was soon widely adopted. This was the missing piece of the puzzle everyone had been looking for, and people soon began using the Linux kernel in conjunction with the rest of the GNU software set. Thus was born the first step towards Linux distributions. It also sparked off a naming controversy: Linus Torvalds and most developers called the combined 'kernel + GNU' system simply 'Linux', while the Free Software Foundation felt this denied GNU the credit due to it, and wanted it called 'GNU/Linux'. The debate rages to this day, with some of the more staunch free software supporters, like Debian, going with the latter convention, while most of the community simply uses the former.
Everything wasn't quite settled yet, though. In the early days, nobody below the level of an advanced Unix techie would have been able to get the system up and running, since there were no 'pre-compiled' packages. People had to download the source code for the kernel and the various other software needed to run the system, and compile it all before they had a working system. The problem was compounded by the fact that there were no automated tools to work out which software a complete system would need. Since this method was obviously very tedious, the community rose to the occasion, and soon Linux 'distributions' – or 'distros' for short – appeared. Their creators compiled the Linux kernel and the accompanying software, and put it all up on the Web for everyone to use. There were many such distros in the beginning, but one of the oldest major distros still around to this day is Slackware.
Over the years, Linux has evolved, with many groups making their own distributions – each with its own feature set, and each targeted at a different audience. Some are for more technically capable users, some aim at the mass market – and some aim for the student community. Indeed, with more than 300 Linux distros to date, the future does seem bright for the penguin community. Oh, we forgot to tell you that? Linux's mascot is a cute penguin called Tux.