Aug 03, 2011

The first widely successful graphical browser was NCSA Mosaic, a browser for Unix on the X Window System, released less than a year after the first Linux kernel. Within a few years the browser was commercialized and became Netscape. Gaining a port for Microsoft Windows 95, it became the browser of choice on that platform, and while the Unix version persisted it was rather neglected in those years.

Netscape on Windows took a major hit when Windows 98 shipped with Internet Explorer built in, and around this time something else very important changed in the Free Software world. It was in this period that Eric Raymond wrote "The Cathedral and the Bazaar", the first study of free software as a technical development methodology rather than just an ethical standpoint. While there is nothing wrong with such a study, it would ultimately lead to a rift between those whose primary motivation was ethical (like Stallman) and those who cared only about practical advantages.

Back in 1998, however, this was hardly an issue yet, and it would begin a major change. Raymond's essay caught the eye of Netscape's executives, and it was in the strategy discussions around their response that the phrase "open source" was coined. Netscape saw a shot at survival there and began a process to release their source code.

The initial model kept Netscape as a commercial browser with extra features, built on top of the structures produced from the open code-base they called Mozilla. Mozilla in those days was a much-needed tool, and every developer's nightmare. The codebase was notoriously ugly, absolutely gigantic by Linux standards, and frequently very badly written and designed. The Mozilla project and its largely volunteer developer base knew it would take years to refactor and improve that code, and thus chose to release incremental progress through what were known as milestone releases.

Most hardcore geeks would regularly download and build the latest Mozilla milestone, a task which on the computers of the time could easily run overnight and into the next morning, and which was fraught with risk: many times it would fail after many hours due to a new bug that had not been picked up upstream.

Distros would take the best build they could, include it, and then release binary builds incrementally as updates at a much slower pace (mostly because it was just not practical to do more), and this is what we browsed the web with. By then IE6 was the dominant browser on the web and Mozilla's market share was tiny. The result was that most web designers simply didn't care to follow standards (especially since neither browser did that well) and just developed their pages to work in IE. Even major banking and government sites often didn't work. It could get worse: at one point South Africa's Independent Electoral Commission's website did a browser check, and if you were found not to be running IE6 it would redirect you to a page instructing you to get "a real browser". Alastair Otter of Tectonic had a field day with that one!

That state would last for several years as the Mozilla code-base was improved and slowly trimmed down. Mozilla as a browser would not ultimately survive, but from its core code-base and Gecko rendering engine would ultimately come Mozilla Firefox. (A side story: this new project, designed as a fast, low-resource browser using the Gecko engine, was originally called Firebird, but the long-established developers of the free Firebird SQL server complained about this bad form among fellow FOSS developers and the name was changed. That was the first of several such naming events; later trademark issues around Firefox led many distributions to make minor forks that changed only the name and branding, Debian's Iceweasel being the first, and ultimately the GNU project created a major fork stripped of non-free bits, with its own add-on site that allows only free software add-ons; this version is called GNU IceCat.) From Netscape Mail would come Thunderbird, and even Eudora persists on that code today.

In fact the list of Mozilla-based applications has grown quite huge, and the Mozilla Corporation is one of the flagships of the free software world today. The ease with which we can load up multiple standards-compliant free browsers today (Google's Chromium, Firefox, etc.) and be quite confident of almost never finding a site they can't handle is a stark contrast to those early days: when we frequently felt the annoyance of a website refusing to play nice, when most of us changed our user-agent to convince sites we were running IE6 so we could get around browser filters, and when getting an update for security and features meant downloading a source package larger than the kernel, with a hugely complex (and atypical) build process that was truly fragile and took a very long time even when it worked.

But we did it anyway – and it paid off.

Aug 02, 2011

Doom the game is almost as old as Linux: the first version came out in 1993 and brought with it a significant number of features and ideas that would become cornerstones of the first-person-shooter genre. Prior games in the genre, such as Wolfenstein 3D, lacked its true 3D view and spatiality, its network play, and similar features.

Doom predated 3D graphics cards by many years, and with this in mind what it could do with a simple 486 CPU under MS-DOS was all the more impressive. Doom also brought another major innovation: WAD files, which allowed players to create their own maps. Over time this would go from an obscurity to a cornerstone of the gaming world, as companies learned that encouraging game modders increased sales and kept players loyal to their games for years rather than weeks.

But Doom would end up taking it all another step further. A few years later id Software open-sourced the engine, first under the Doom source license, then in 1999 relicensed under the GNU GPL. It was the first major commercial game ever to have its engine open-sourced, and it renewed sales. Suddenly many programmers with fond memories of the game were porting it to new OSes and platforms (there was even a version for the earliest Nokia smartphones). id however kept control of the data files, so these Doom source ports created a market for a game that had long been unsellable. Suddenly people were buying it again, to play with the new engines.

At the time of the release John Carmack wrote in a note in the source code that the decision was made to show others how this groundbreaking game had been created. He pointed out areas of the code which were obviously done badly (as seen several years later) but which, since nobody had done anything like it at the time, had seemed sensible when they were written. For example, Doom actually rendered the entire room; later games learned to spare resources a bit more and render only what is in a player's field of view.

The Doom release was so successful that id made it standard policy to release its game engines a few years after the game was out, for hobbyists and retro-gamers to keep alive. They don't want to do it on release because they make quite good money from licensing their engines to other companies that specialize only in the design side of game development, but once the engines are no longer absolutely cutting-edge they all become free software. Quake was the second to follow, and all the subsequent Quake releases have done the same (I believe Quake 4's engine is not yet free software, but its time will come too).

This created two other revolutions in the GNU/Linux gaming world. Firstly, budding game designers would use these older engines to build their own data sets for, often clones of the originals, allowing players to play them free of charge while learning valuable skills in the process. Secondly, budding game-engine coders could use them as reference code to learn from, and that gave us several quite amazing free game engines, of which Cube was among the first.

Jul 28, 2011

Finger, despite the naughty name, was in many ways the precursor of modern social networking sites; Facebook statuses and Twitter both bear significant similarity, in both purpose and use, to what finger once was.

Unlike these sites, however, finger did not rely on any central database. Finger was a Unix (and Linux) concept comprising three pieces working beautifully together: the fingerd daemon, the finger client, and the user's .plan file. If you remember my earlier post about mail systems you will see how it was used the same way before the internet; with the internet, finger became network-aware.

The first part, then, was fingerd, the server daemon a Unix machine could run to make finger work. It listened for requests coming in from the finger command and responded to them. Fingerd ran on a privileged port (the well-known port 79), so it had root privileges (which, for its mode of operation, were important as well). The finger command took as its parameters the host to contact (or the local machine if none was specified) and the username of the person you were checking.

Upon receiving a request, the finger daemon would look in that user's home directory for a file called .plan and, if found, send its contents back to the finger client, which would then display them.

So users could write status updates in .plan, and other users could read them by fingering them. The crucial difference between this and later social network statuses, of course, is that updates were delivered on demand only, using pull technology as opposed to the push-to-all-followers approach of modern social networking systems.
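To make the three pieces concrete, here is a rough, hypothetical sketch in Python of how the daemon and the client talked to each other. The function names and the home-directory root parameter are my own inventions for illustration; a real fingerd listened on privileged port 79 and had to manage root privileges, none of which is shown here:

```python
import socket
from pathlib import Path

def handle_finger_request(conn: socket.socket, home_root: Path) -> None:
    """Sketch of the daemon side: read "username\r\n" from the client,
    reply with that user's .plan file (or a notice if there is none)."""
    user = conn.recv(1024).decode("ascii", errors="replace").strip()
    plan = home_root / user / ".plan"
    if plan.is_file():
        conn.sendall(plan.read_bytes())
    else:
        conn.sendall(b"No plan.\n")
    conn.close()

def finger(user: str, host: str, port: int = 79) -> str:
    """Sketch of the client side: send the username, then read the
    daemon's reply until it closes the connection (pull, not push)."""
    with socket.create_connection((host, port), timeout=5.0) as sock:
        sock.sendall(user.encode("ascii") + b"\r\n")
        chunks = []
        while chunk := sock.recv(4096):
            chunks.append(chunk)
    return b"".join(chunks).decode("utf-8", errors="replace")
```

The pull model is visible right in the client: nothing arrives until you ask for it, and the whole "status update" is just whatever happens to be in the file at that moment.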

Finger was used professionally (employees would write about the project they were working on, so their manager could quickly check what everybody was busy with), socially (much like Twitter is used now) and even as a marketing tool. For many years id founder John Carmack kept his .plan on id's public Linux server regularly updated, and gamer geeks avidly followed it to keep track of development work on the latest Doom and Quake installments. One of the most famous updates ever posted there came right after the Doom source code was first released under the GPL in the very late 90's; it concluded: "I did it because Linux gives me a woody*".

Finger, however, was designed in the early days, when the internet had no need to be secure. Its original design was, frankly, quite atrociously insecure. Later versions improved a bit; for example, dedicated finger daemons could give up root privileges once they had bound the port (and though this relied on .plan files being world-readable, it was better than having a root-privileged process randomly scrounging in people's home directories). But even then it was a high-risk service to run: a brute-forcing client could easily use finger to get a list of valid users on a server, which meant it could limit its password attacks to users that actually existed.

Companies like id that deliberately ran public finger servers later kept them in DMZs, but ultimately the program went from a default installation on any Unix server to an obscure tool that was rarely used, since hardly any server had it, none enabled it by default, and no prudent sysadmin would willingly enable it unless very sure of the network security on the other end of the link.

So finger went the way of the dodo, because people were concerned about its privacy implications and because it left one open to attack by cybercriminals. Ironically, neither Twitter nor Facebook has relieved either of these concerns (if anything they have aggravated them), but the internet community of today is not the community of the late 90's, when the net first started to swell and things like corporate internet connectivity began to go mainstream. It was in that era, when most people's internet was still managed by skilled sysadmins sitting between them and the net, that finger for all practical purposes died; it took nearly ten years for Twitter to replace it.

There are still finger servers and clients in development today, and simple ones are frequently still used as the default "how to write your first network-capable program" example in many textbooks, but the phenomenon it once was is long over. Nonetheless it did leave a legacy. The memory of it remained, and a new generation of programmers would remember that people had once enjoyed looking in to see what their friends were up to, and would bring that back to the internet in a new form.**

*In case it wasn't obvious, this was probably the most polite slang for an erection he could think of.

**I have no evidence that either Twitter's or Facebook's developers ever used or were consciously influenced by finger, though I'm sure they would know about it. This is not meant as a claim of direct origin but rather an extrapolation of a basic idea, invented and reinvented. Whether the reinventors knew, or cared, that they were not the first doesn't change that they weren't.

Jul 26, 2011

While RMS was never a fan of vi, the reality is that vi and emacs are not all that different. Both work on a principle known as dual-mode editing: one mode for actually entering text and another for manipulating it. This is a very powerful way to work (if not as simple as the one-mode-only approach of most graphical text editors), as the command mode used for manipulating text can have almost programming-level power to rapidly make mass changes of all sorts throughout the text.

In emacs this came from Lisp scriptability; in vi it came from the command-line at the bottom of the editor (accessed by hitting Escape, then colon). Vi's lineage goes back to the earliest days of Unix: it grew out of the line editors ed and ex, and implementations of it eventually appeared on many other systems, including VMS. The original vi was not free software, but free software clones soon appeared. Two of them would go on to survive into the world of GNU/Linux: elvis and vim. Elvis is the more basic of the two, barely more than its decades-old ancestor, just a basic vi-like editor, but it still survives in a niche for small setups.

For example, Slackware has elvis in its tiny install-phase environment. VIM, however, is the big brother of the vi clones: Vi IMproved indeed, it grew features upon features upon features while retaining its small-footprint, super-fast and very powerful heritage. As of version 7, VIM even supports such modern programmers' features as syntax highlighting, code folding and auto-indentation.

The editor's command mode makes it almost a full IDE in one simple console program, and to this day it's a favorite among Linux programmers and sysadmins. For newer users who came into Linux in the desktop world, though, it's at best an archaic curiosity. This is at least part of why most distributions no longer install it by default, but I've yet to see any major distribution that didn't have it in its repositories. It's just too ubiquitously wonderful to work in, and too popular with geeks, not to be there.

What really says the most about VIM's true power is how it has managed to hold its own to this day among programmers, even with stiff competition from the likes of Eclipse. Generally, if you're the kind of programmer who grew up with, and likes, systems like Eclipse, you won't be able to stand working in vim; it will feel primitive and cumbersome. But the reality is that those of us who learned with vim find exactly the same thing with graphical RAD environments like Eclipse and KDevelop. They're clunky, the menus get in the way, and the mouse is an annoyance because we have to break our keyboard focus to use it.

Some like to click Edit|Search & Replace, but some of us love the sweet power of [Esc][:]%s/Searchterm/Replaceterm/g

That's right: regular expressions built right into command mode. But it gets better: want to quickly delete the third word on every 12th line? VIM can do that; can you do it in Eclipse? You may think such an esoteric feature would barely be used, but because it's command-based it can be modified into any variation you can imagine, and that gets used all the time.
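For the curious, here is roughly what that looks like. These are standard command-mode incantations (the every-12th-line one is just one possible way to write it; vim offers several):

```vim
:%s/old/new/g           " replace old with new throughout the whole file
:g/^#/d                 " delete every line that starts with #
:10,20s/\<foo\>/bar/g   " whole-word replace, on lines 10 through 20 only
" delete the third word on every 12th line: run a normal-mode edit
" (go to column 0, skip two words, delete a word) on matching lines
:g/^/ if line('.') % 12 == 0 | exe 'normal! 02wdaw' | endif
```

The point is not any one command but the combinatorics: ranges, patterns, and normal-mode edits all compose, so variations you have never needed before can be improvised on the spot.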

If you've never learned vim I would heartily recommend it; it may be old-school but it's still rock 'n roll to me.

Jul 04, 2011

I have to admit that I have never been a huge fan of emacs as a program; as the classic joke goes, emacs is a wonderful operating system, too bad the editor sucks. Having said that, emacs's incredible contribution to computing cannot be overstated: it sparked a revolution that is still going on, and from a historical perspective it may be the single most important piece of software ever written.

EMACS had its origins while a young Richard Stallman was working as an operating-system developer in MIT's AI Lab. Originally written as a developer's text editor for the ITS operating system, Stallman had written the program in a high-level language, making it portable, and since the program and source code were freely available it very rapidly became extremely popular among academics working on various operating systems, in particular Multics, VMS and Unix.

Part of what made it so popular was that the program was developed at the AI Lab, and had exceptional support for Lisp, the AI language of choice (a language which also happens to be almost identical to lambda calculus, and one of the strongest arguments in favor of the idea that software is just mathematics). EMACS didn't just support Lisp; it was extensible through Lisp, and before long some very powerful extensions would come to be written. A web browser and an e-mail client would ultimately be included, and there was even a powerful ELIZA-style program that sounded like a Rogerian psychotherapist and really could fool people for a while.

It was there that Stallman did something incredibly important. In deciding how to deal with all these extensions, Stallman came up with the EMACS commune agreement. This document, simple as it was, was a truly crucial step in history, because that agreement would later become the model for the GNU GPL.

That was emacs's biggest contribution to the history of free software: in a very real sense it was the first program ever to be copylefted. While the academic notion of sharing was alive and well in software from its inception until well after EMACS was first written, EMACS took that notion outside academic circles into the wide world of software as a whole, and formalized the ideals with a written (if simplistic) agreement.

It was the start of what would become the Free Software Foundation and the GNU operating system, one variant of which I am using to write this post, and the model for the GNU GPL, the license that governs the vast majority of free and open-source software, including the MySQL database behind this site and the WordPress platform that is showing you this post. If you are using Firefox to view it then your browser is under the MPL, a copyleft license inspired by the GPL's ideas; if you're using Chromium then it's under a permissive BSD-style license.

That's a pretty impressive legacy, but the emacs legacy doesn't end there. Shortly after leaving the AI Lab to begin his work on the GNU operating system, Stallman recognized that quite a lot of potential users of his program at that time (the mid-1980's) didn't have internet access to easily download emacs. So he packaged it on tapes (then the most widely used portable mass-storage medium, the equivalent in that computing era of memory sticks or rewritable DVDs today), and sold those tapes for a nominal fee.

This was the first ever free software business. With it Stallman proved that free software was not only an ethical force; it created lucrative market opportunities that could be profited from. It paid his bills and much of the early GNU development costs for quite some time (the first of several FOSS businesses he ran before a MacArthur "genius grant" removed any need for him to ever earn a living again, so he could devote all his time to the FSF). This business of collecting and prepackaging free software was a forerunner of the later business model of creating free software distributions to be sold, which in the early nineties gave birth to the GNU/Linux distribution concept, a concept we all still use today. Ultimately Ubuntu has its roots in Debian, which alongside Red Hat, Slackware and a few others was doing exactly that to make money in the early 1990's, when internet access was still rare and expensive.

It's still the primary business model for Slackware in fact, and devoted slackers usually buy a copy of the distro (even if they have already downloaded it) to support Patrick Volkerding's work.

And that was emacs. Its history was not without controversy, however; it was also at one point at the heart of one of the most unsavoury bits of FOSS history. Major projects forking is a very rare thing, and when it happens one fork mostly wins out completely; on rare occasions you will instead find both forks appealing to strong but non-overlapping groups of users and existing in conflict for a long time. There were really only two major times this ever happened, and Stallman's software was involved on both occasions. EMACS versus XEmacs was the first, and Stallman still doesn't like mention of the latter; the other was gcc versus egcs. The latter pair worked out their differences after several years: the egcs code was folded back into gcc and it became one project again. The former was a bitter feud lasting nearly a decade before XEmacs ultimately died a quiet death.

EMACS is still around and included in the repositories of almost every distribution, though almost none include it in their default installation anymore. It's a huge program (quite possibly one of the largest text-only programs ever written), and a dedicated group of users (mostly developers) continue to deem it their editor of choice, but its days as one of the mainstream programs of the Unix/Linux world are long over. In this day, when hardly anybody uses a text-only text editor anymore, the spot for the occasions when you need one was won mostly by vi, in such incarnations as VIM and elvis, or, for the Ubuntu users, by their distribution's default text editor nano, itself a clone of the older pico editor, which was created as an easier-to-use alternative to the standard options of vi and emacs.

Stallman is quite unbothered about vi; though he is no fan of it, it holds a prime spot in his St. IGNUcius sketch, where he states that using a free version of vi is not a sin in the Church of Emacs but a penance.

Jul 01, 2011

I have decided to begin a series of blog posts in which I will detail some of the wonderful free and open-source software of the past: software that us FOSS guys once used on a day-to-day basis but which has fallen by the wayside, some that explored truly wonderful ideas before their time, and other gems from the now nearly 30-year history of free software. Most of the programs I will talk about are things I used heavily myself in the past. Those newer to this world (e.g. everybody whose first system ran Ubuntu) have probably never heard of most of them.

Many of us old-schoolers still use some of these on a daily basis (you may or may not be among us), but it's worth recording where we come from, what we learned, and who taught us; on such knowledge are smarter futures built.

The reason for the listing is to give a bit of credit to the visionaries who helped build the base for the FOSS world of today; I'll look into why those programs were considered great at the time, and into their lasting impact on the FOSS world of today. Here is my initial list of programs to cover; I may expand it if I think of more during the course of the series:

  1. Emacs – the program that really started it all
  2. 3DWM – Blue-sky dreaming of a better interface world
  3. Sendmail and Fetchmail (done as a single post although their histories were not directly related)
  4. DEVFS – Richard Gooch the unsung hero of paradigm shifts
  5. Midnight Commander – The app no sysadmin wanted to be without
  6. VIM – Because some of us didn't like emacs
  7. Finger – before twitter and facebook, we did status updates like this
  8. Doom – It was not free software to start with, but when it became so, id changed the face of game development with it for the second time.
  9. Netscape and Mozilla-Milestones – Before firefox, we browsed like this.
  10. ION – The radically new idea that never took off, or did it?

The series will be tagged "foss_archeologist" and will start with the first post on Monday. In the meantime, if you have suggestions for amendments, please feel free to list them in the comments and I'll consider them.