Oct 28, 2014

If you ask any geek about his browser, you'll get one of several answers, but if you ask about addons
there is one consistent theme: all of them use some kind of adblocker. Technically savvy people don't
see ads on the web anymore, and generally this has made browsing much more pleasant for them.
It has also reduced their risk of spyware and other malware infections.

So far so good, but could there be a downside to this? Not seeing ads means most engineers don't
see how targeted they've really become, don't experience the amount of data collection that
this reveals – and thus have no itch to scratch on the underlying data collection itself.

Private companies collecting data for targeted marketing have been shown not to be trustworthy
with that data; we know they've been happy to sell it to third parties – including governments
and government agencies like the NSA.
Some geeks have been warning about this for ages – Richard Stallman predicted it in 1983,
30 years before Edward Snowden revealed it was happening, and the organisation he started
to fight for free software was in part motivated by trying to prevent this risk.
It is still one of the organisations at the forefront of fighting to reclaim our privacy, with
projects like diaspora and mediagoblin (which I wrote a short piece about last week).

But for some reason, even now, after Snowden's revelations – these FSF projects aren't getting
mainstream traction among geeks. There is still not enough drive behind them. It's becoming
ever more clear that there is no political solution to this issue – yet the technical ones
are struggling due to a lack of contributors.
Many of the very best engineers are actually working for the biggest culprits!

Why is this? Why do engineers not feel the need to contribute to, make use of, and drive
technologies to end these corpo-government intrusions into our private lives? I think in
part it's because even good things can have unintended consequences. It's just possible that,
unlike everybody else, the one group who can appreciate the visible evidence of data
collection and infer the scale required to do it is not seeing that evidence, because
years ago they started blocking the channels it exists on (since those channels are annoying).

Now I would never advocate that we stop using adblockers; if anything, I would advocate that
we get them more widely used (if enough people used them, the advertising market would
collapse and a lot of the monetary reasons for data collection would disappear) – but in its
current state as something mostly used by tech-savvy geeks and engineers, adblocking may actually be
having a negative side effect by making those most capable of finding solutions to these issues
less aware of them and less motivated to do so.

So, no, don't uninstall your adblocker, but remember why you wanted it in the first place and
help us bring about a new, truly peer-to-peer internet. Let's contribute to the FSF projects fighting
to change the way people share things online so that, once again, the users can control what
they share with whom.

Dec 10, 2010

Yesterday I saw a surprising post on twitter in which Ivo Vegter was extremely critical of Julian Assange and wikileaks. This surprised me, as I would have thought that Ivo's strong libertarian leanings would make him a fan of wikileaks. So I asked him what he thought.

He replied (note that I am very slightly paraphrasing to keep the meaning clear out of context): "Wikileaks is anarchistic; I am not. Their actions are extra-legal and undermine government's ability to protect life, liberty and property".

I chose not to answer that at the time, rather just to thank him for the clarification – because I wanted to think about it. In the end, though, I return to the opinion that we need wikileaks. The idea of wikileaks isn't actually new – sites like it have existed (and posted more revealing material than it has) for as long as the internet has been around. What Julian Assange did, however, was the one thing none of the underground sites achieved: he got famous.

Now I saw Penguin Pete's statement that this is a bad thing – but I disagree. What Assange's fame did was to bring these secrets out into the public consciousness. No longer are they just something that gets posted, a few people read – and the newspapers ignore. Now they are big news – the newspapers are under pressure to publish – and the promise of the internet comes true: everybody knows.

I disagree with Ivo most strongly, because I never thought the government was all that good a way to protect life, liberty and property in the first place, and in fact many libertarians do agree – going so far as to believe that police forces should be disbanded and private security companies do law enforcement.

My old love for cyberpunk, however, is heartened by these events. The internet was built by anarchists, the old-school hackers (in the original meaning of the word) – share and share alike, a voice for everybody, no central control, no government, no rule or authority – and it proved that a system like that can in fact work. The original social network (slashdot.org) managed to build a powerful community in the complete absence of rules, by a simple system of community self-moderation, proving that there are always enough voices of sanity who will promote other sane voices so that the trolls and flamers get drowned out by the noise.

But then slashdot was never afraid of somebody saying "fuck" on their forums.

It all started to rather fall apart. The more popular the net became, the more its focus shifted to sites where – rather than anonymity – we were forced into revealing too much: the facebook corporate 1984. Instead of anarchism we became slaves to the authority of the companies behind those sites…

Wikileaks is not a new phenomenon but rather proof that the old-school hackers are still around. The most recent cables offer proof that the state department sanctioned, and in fact helped cover up, a party where DynCorp made boys aged between 8 and 12 years old available to potential business partners in Afghanistan – to be raped, as a tool for business recruitment!

These are the kinds of things that are happening in the world. It's what governments are really spending our money on.

So unlike Ivo, I do not for one moment think that wikileaks is harming governments' ability to protect rights; on the contrary, the culture of enforced openness, of no-more-secrets, has one guaranteed effect: forcing them to actually do that. In a world without cover-ups and classified cables, governments cannot sanction and cover up the rape of young boys. Governments can be the worst violators of rights there are – of the rights of their own citizens and of other countries (and if they are effectively ruling a country during a state of war, they'll do it worst of all – as the USA is in Afghanistan now).

What wikileaks has done is to make the revelation of these crimes impossible to prevent – and in so doing it has done more to protect our rights from our governments than any amount of oversight committees (who are themselves bound by classification) ever could. I don't trust corporations – and I don't trust governments. The executive branches of governments are the least trustworthy of all – and most importantly, the money they violate rights with is my money. Our money. We have a right to know what that money is spent on. We sanctioned its expenditure on the protection of our rights, and on the provision of those services which are best done on a "profit for the community rather than the provider" basis. We have a right to audit every penny because it's NOT their money. It's ours.

It may be illegal – but it's not immoral, and frankly it shouldn't be illegal. I would not let my stockbroker invest my money without telling me what he invests it in – every cent. Why the hell would anybody imagine the government should get to spend our money without accounting for every penny? Without us knowing exactly what we're paying for? When you're paying to have the rape of young boys covered up, you have a right to find out about it, and the people who risk their lives and liberty so that you may know about it are heroes.

Finally – I believe that the fewer secrets there are in warfare, the more civilized it must by nature become – and ultimately, if we're lucky, it becomes impossible. Whistleblowers end wars. I still say Assange should get the next Nobel Peace Prize. He's already done more to end the Afghanistan war than Obama ever could.

Nov 05, 2010

So Fedora 14 coming out meant I wanted to try it. I've been running F13 on three machines so far: my work laptop, my media-player machine and my gaming desktop. On my work laptop the upgrade went smoothly and it runs beautifully; the reasons why I first switched to it (resemblance to the RHEL systems running on the servers) still apply – and I have gotten pretty adept at Fedora's little quirks, so I'll keep it there – it works wonderfully in the office. The media machine is barely affected by the choice of distro, because once set up the only software on it that matters much is XBMC, so I won't be installing any upgrades on it soon – it's not like it's ever going to be at risk of security breaches – all it does is play movies.

My gaming desktop, however, was another matter. From Fedora 13 to Fedora 14 there was a regression in usability, on the kind of setup I have there, so extreme that I couldn't bear it. Upgrading failed miserably, leaving the system barely functional, so I did a clean install… and the problems didn't go away (I suppose not using the live media made it harder, but Fedora's design means that if you want to save bandwidth by reusing a download you already did, you can't do so with live media at all) – either way, the nouveau driver, while coming along nicely, is simply not good enough yet at the primary task (accelerating 3D) to use for gaming. Bugger. That's where things got hectic. It took hours of figuring out and googling to get the nvidia driver to work at all – and then it would only work on one screen at a time – so much for the lovely dual-screen setup I've used for nearly 3 years now!

Fedora's pulseaudio has been my biggest annoyance with it ever since F12, as I still think pulse is a solution looking for a problem, not finding it, and thus creating a whole bunch of new ones instead. Fedora 14, however, proved to be a massive headache on every level. I don't much blame Fedora for the nvidia difficulties – that's nvidia's fault for not having a free driver, and the third-party packagers' for doing the worst job they ever did with it – but yum and packagekit reached new levels of broken integration, and the upgrader originally didn't bother to update my repositories (not even the official fedora ones) to know I'd changed releases… basically, I'm sorry, but F14 is the worst desktop release Fedora ever did and it was completely useless for my home desktop. It seems to work fine for the business-oriented usage of my laptop, however; if that's all Fedora's developers care about, then it's all I'll use their work for.

By 10pm last night I was simply too frustrated to keep fighting with it – I actually had other things I wanted to do on my computer this week and I wasn't getting any of them done. So I decided it was time for a new distribution – fast. I decided it was time to see how far kubuntu had come since I last saw it. Now, my history with Canonical's distribution(s) has been shaky. Five years ago I got a copy of the first ubuntu release and it's safe to say I couldn't see what the hype was about. OpenLab was a far more advanced distribution, both in terms of ease of installation and ease of use at the time, and ubuntu's massive resources made this inexcusable – I was one man and I outdid them. Yes, I'll back that up. Just one example: ubuntu came on two CDs – one live disk and one install disk (which was text-only…). OpenLab came on a single CD, an installable live CD (in fact it was the very first distribution to ever do so; it had been possible to install earlier live disks like knoppix manually, but OpenLab had an easy graphical installation built into the very CD from version 4 – which came out at the same time as the first Ubuntu).

Over the years I would sporadically try the Canonical systems again. Kubuntu, the KDE version, developed a reputation among KDE users and developers as the worst choice of distribution for KDE users – it had barely any resources compared to the many in Ubuntu, and was buggy and slow and badly configured, with horrible theming and broken defaults. Well, I tried it again last night – and credit where it's due. After 5 years, Canonical has finally impressed me. This is one solid distribution; kubuntu finally doesn't suck – and in fact it worked more smoothly than Fedora by a massive margin. I had everything set up to my liking in under an hour, including the custom things that I usually want to do. The old "thou shalt not touch" policy has been abandoned, and instead the system made it easy to find out how to change what I needed to get what I wanted. I had my chosen display setup in seconds. The only glitch was with nvidia-settings not wanting to save the changes, but that was easy to fix (copy the previewed xorg.conf file into a text editor, save it and copy it into place). When the only bug I found is in software that Canonical cannot fix even if they want to (though it's odd that I've never seen the glitch anywhere else before), it's not their fault.

It gets better.

I can't find any sign of pulseaudio anywhere. Despite their initial bullying "you will like it because we tell you to" attitude about it (which led to at least one Ubuntu MOTU resigning), Canonical seems to have finally listened to the masses of users telling them that pulse is broken, doesn't add significant value and makes our lives harder. Pulse is gone! I am back to good old works-every-time ALSA sound and it's a thing of beauty! Chromium is in the default repositories – so no need to go download it manually like I had to on Fedora – and Amarok seems to work a lot better than it did on Fedora (read: it was so bad there that I ended up using rhythmbox in KDE rather than deal with it!).

Well done, Canonical – you finally built a distro as good as the one-man projects out there. You actually seem to have finally let your squadron of ubergeeks listen to your users, listen to your community, and you've built not only the best release I've ever seen from you, but in my opinion one of the best distributions currently on the market. I still think it's a major issue that you don't meet FSF criteria, because you are at a point where everything works so well that I think most users could actually cope just fine if you did – you'd not be sacrificing any major functionality anymore. A few edge cases (like hardcore gamers) may want or need something that you wouldn't be able to support in your repositories anymore – but then, those edge cases are almost by definition quite capable of figuring out how to add just the one bit they need. You've got an amazing distribution – it took you five years of lagging behind almost every other unsung desktop distribution (PCLinuxOS kicked your butts for years, Mint has outdone you every time, Kongoni was a better desktop distribution – and that was targeted at hardcore geeks of the gentoo-on-a-desktop variety) – but you've finally built a distribution that deserves to be in the market-leading position you hold.

I admit it – Canonical did a damn good job on Kubuntu with 10.10 and I will, for the first time ever, be comfortable recommending it to newbies. Well done to the developers – and keep up the good work.

Nov 04, 2010

In a way, I could say every book I've ever read has changed at least some of my views on some things. It would be a pretty piss-poor book that didn't make me question at least some of my ideas, even if the changes it brings about are minor.

But I will focus on one that fundamentally changed my views about something I thought they were already good on – and that has been a guiding principle in my career and life ever since. In a break with tradition, I actually met the author of this book before I ever read any of his works. Back in 2001 I was a champion of the open-source idea. I spoke about the technical power that can be unleashed by sharing work and sharing eyeballs. I spoke about the security benefits – and even the fun of being able to customize. I avoided closed-source stuff but I didn't actually think it was wrong – just… lesser.

Then I went to the first Idlelo conference to deliver a paper on a distributed educational content delivery network I had been developing. At the time it was groundbreaking stuff, which is why I got invited. The keynote speaker was Richard Matthew Stallman – a man I had long held in awe for his programming skill, for founding the GNU project and for writing the GPL, but who I felt lacked the bit of pragmatism that "linux" would need to become mainstream.

Sitting through his talk, though, the passion with which he spoke resonated with me. I found myself agreeing with him, and coming to the same conclusion he did: that free software isn't a nice-to-have, it's a right – perhaps, more practically speaking, an obligation of programmers to provide. I was sold. I also laughed out loud and caught every joke in the Saint IGNUcius comedy routine, and frankly I think those rare few young "geeks" who freaked out about it a year ago are simply proving that they are utterly out of touch with the culture that created the very movement and software they claim to be passionate about – with its playful, anti-authoritarian nature – and with the fact that this nature is more crucial to its very existence than all the programming skill in the world.

If you can't take and make a joke, you don't belong in free software development; we can get better code out of a worse programmer who has a sense of humor.

Of course a lot of people wanted his time, so my initial opportunities to engage with him one-on-one were limited. Then I learned that he has a deep love of mountains, and offered to take him on a drive around the mountains of the Western Cape winelands. We spent the trip having deep and intense discussions; mostly I was listening like a student at the feet of a master, but sometimes I disagreed, and he could debate quite graciously (granted, none of the disagreements were about software freedom issues, about which I believe he is rather unyielding).

By the end of our trip and talk, he gave me a signed copy of a book containing his collected essays. I treasure that book. I reread it every now and then. I know every argument by heart and I have spent the past decade living by them. I am in occasional e-mail contact with him. While I was leading kongoni development, whenever we had to make a judgement call about a piece of software I would mail him for his input and take it as a major voice. It was by his encouragement that Kongoni included software to play DRM'd media – software that is illegal to distribute in the USA or that violates patents. My country doesn't have those patents, and his advice was clear: do not give the bad guys more power than they have; you still have freedom from patents, you don't have a DMCA – let the tools people need, so as not to be ruled by it, be in your system.

A champion of open source became an unyielding advocate for free software, and I can say with pride that kongoni was a fully free distribution under my leadership – and recognized as such by the FSF – and that when I handed leadership over to another, I did it on condition of his promise that he would maintain that status (of course, he is not under any legal obligation to – all I have is his word, but he's kept it so far). I had a look at the most recent release the other day – it's quite sweet; he's really done good work using what I built and building on top of it.

Believe it or not, I'm more proud of what was built out of my creation than of the creation itself. That I could write the first versions of that code is a matter of pride; that somebody else could write something better because he could start where I left off is a matter of greater pride. Newton spoke of standing on the shoulders of giants. The giant whose shoulders I stand on is Richard Stallman, and the values I adopted from him have allowed me to be a giant on whose shoulders somebody else could stand.

It really is a great book, by a great man.


Aug 17, 2010

Just a few years ago, the running geek joke was that Larry Ellison was second-rate, knew it, loathed it and suffered under the withering scorn of Microsoft, which was at that stage "outcompeting" Oracle on every front. Heck, MS-SQL was even outselling Oracle Database, as impossible as that may sound today.

Well, the joke is over, that's for sure. Larry spent the last few years on a drive of targeted acquisitions that usually ended up buying companies for their products and putting most of the employees who created those products out of a job (that, by the way, oh ye who worship the "invisible hand of the market", does NOT count as economic growth – bigger companies with FEWER employees is bad for everybody, including customers), culminating now in the acquisition of SUN.

Unlike in most such acquisitions, Oracle did not need to fire most of SUN's top engineers – they almost all walked out on the first day in protest. These were people who worked for what was once perhaps the closest thing to noble any corporation could be – a company once rated the best I.T. company in the world to work for, founded by engineers. Oracle's culture is almost the polar opposite – for them it has always and forever been about one thing only: how much money can we make.

Oracle’s purchase of SUN gave them control over a number of major technologies – the SUN hardware business being, practically speaking, the least of them. SUN may not in recent years have been very good at monetizing its assets, but the software technologies it owned were nonetheless disruptive, innovative and major forces in the market – and now Oracle owns them all.

They own MySQL – a database that was rapidly chewing away at their market share. Most analysts never realized just how huge a threat to their primary bottom line MySQL really was. In a few more years, MySQL might have supplanted Oracle as the market leader in databases, and the number-two spot would have belonged to PostgreSQL. If you thought Oracle’s competition was Ingres and IBM’s DB2, they looked untouchable – but while this fooled analysts (and Oracle was happy to keep them fooled), it wasn’t a true picture. Oracle knew very well that MySQL and PostgreSQL had the capacity to take the database market from them – an inevitability that is practically built into any successful FOSS business model.

So Oracle bought SUN to get MySQL. The other major technology they wanted was Java – that most beloved of academic languages that somehow never took off on the desktops or the web it was supposedly created for. It didn’t take off on desktops because, frankly, the story of Java the web-language was a bit of marketing. James Gosling and his team had designed Oak: a language created for mobile and embedded systems, to capitalize on a coming revolution. SUN wasn’t wrong in predicting said revolution – they were just 15 years too early. So in the meantime they reinvented Oak as Java, called it a web-language and got it out there, getting a stable of developers ready.

As Java expanded it became a darling of back-end services and application-server systems (tomcat is a lovely example). It became a cornerstone language in the market for many tasks (developing user-facing desktop applications was never its strong suit, but there’s a lot more to the programming world than those) – and when the embedded revolution did come, Java was its darling.

It still is. J2ME is the most widely usable phone development platform there is. Android apps are written in a slight variant of desktop Java (but Android can also run J2ME apps through a compatibility layer). Even Windows Phone 7 devices support Java apps. The only exception is Android’s biggest rival: the iPhone.

People talk about Steve Jobs’s refusal to allow flash on the iPhone, but much more important is his continued prevention of Java as a language. Both are blocked for one reason only: it makes the iPhone into a walled garden, whose apps run on nothing else, and which cannot run apps developed for anything else. Such deliberate blocking of interoperability is bad for consumers and gets worse in the long run – in fact, it’s a classic Microsoft business technique (less so nowadays, because Microsoft is frankly not as powerful as it once was and cannot get away with it so easily).

Android is the great thorn in Apple’s side – a platform that gives comparable features while being open and interoperable breaks down the value of their walled-garden approach. Apple, however, never had the guts to sue Google – instead they sued HTC and other handset manufacturers, their hope being to scare the handset makers away from Google’s stack with the very real threat of patent litigation.

So far, nobody has backed down, so I think Apple’s plan isn’t working very well for them. Larry Ellison, however, did not sue HTC. Larry went after Google itself. Oracle doesn’t have much choice, really – their claim is that Google’s adaptation of the desktop JVM to a phone (rather than a desktop computer) violates the Java licensing (those parts that aren’t GPL’d, at least) and patents. Patents which recent posts by people like James Gosling reveal to have been filed for absolutely no other reason than to build SUN a defensive position, after other companies sued the once-patentless company over trivial patents and won. Patents created through a "let’s see who can get the stupidest patent granted" competition among the staff!

Now those patents belong to one of the most unscrupulous businessmen in I.T. today. The suit against Google is about one thing: firmly cementing Oracle as the dictator over Java – they who shall decide which Java features are available on which platforms. Google perhaps has some room for a defense based on stretching the definition of a desktop computer. Android is pretty close to a desktop OS as it is, and tablets will bring it even closer (much as they did for Apple). As the line between "phone" and "pc" has gotten blurrier, perhaps the legal separation of the concepts isn’t so clear anymore either. I’m no lawyer, so I won’t debate the viability of this, but it’s worth considering that when J2ME was created, with its smaller feature-set (a feature-set not good enough for Android’s capabilities), phones (and the apps they could run) were far less powerful than they are now. My HTC Desire has more processing power than any of my first 5 computers. It just happens to fit in my pocket.

Oracle wants control over Java at that level. SUN already gave us the core Java technologies under the GPL, which limits what Oracle can do with the code itself – but here they are showing the power of patents. The Harmony class libraries from Apache were based on the GPL’d Java source code, and Android’s JVM is based on Harmony – yet Oracle is asserting a power that the GPL specifically removes: the power to control where and how the code may be run. Harmony remains an uncertified Java implementation, because getting certified requires complying with an additional license that removes almost all the GPL freedoms.

Oracle didn’t go after Harmony – at least, not yet. They went after Google, and they have one goal in mind here: to take back control over Java. Ironic, because it’s exactly the fact that SUN became ever more relaxed about controlling Java over the years that allowed its continued growth. It remains one of the few parts of SUN’s software business that was actually profitable right to the end.

But control Java, and you control a huge section of the software market – particularly that part where Oracle is the strongest. If you destroy it in the process? So what. Oracle DB will only get stronger if that happens – they would much rather lose the Java revenue and protect their database market at all costs.

So does this mean the end of Java? This lawsuit already has companies clamoring to start moving their code from Java to other platforms, which has a largely negative knock-on effect on everybody (and ultimately the worst on consumers) – so it has already done terrible harm. It is likely to get worse. If Google prevails, or comes out with a good settlement, then mobile Java may yet survive – it’s too huge a market to die easily. If they fail, even that is dead.

But Java as we know it died the day Larry Ellison filed that lawsuit. It will spend quite a few years in involuntary muscle spasms as the case drags on – but it’s dead. In the interest of consumers and corporates and everybody else outside Oracle, it is now truly vital to viably replace all of Java with a truly free alternative. The good news is that the core Java technologies ARE GPL’d. Java may be dead – but it is now time to resurrect it, in a new form without corporate control. Use the GPL’d code that SUN gave us before its demise and rebuild the rest from the ground up. We weren’t far from it even before – nothing should stop us now.

I propose this as the new number-one entry on the FSF’s important-projects list. We need a free J2ME, a free JVM, a free servlet engine. I write as somebody who learned Java at university and has never voluntarily used it since. I despise the language; I find it clunky and hard to read, and harder to build with, and I much prefer leaner and cleaner languages like python myself. But I recognize the value Java and its position have brought to computing, and I recognize the harm that reverting this power into a single corporate entity’s hands can do once more. In fact it will be far worse now. Java is much more powerful, and it’s not Oracle’s primary product – for them it is nothing BUT a means of control – so they will fight to control it entirely, and with it a thousand companies and a million developers and a hundred million users.

I may not like Java – but I know we cannot let that happen.

Aug 04, 2010

Lawyers have successfully managed to argue that computer programs are not mathematics and thus should not be covered by the exclusion of mathematics from patentable material. This comes from a deep misunderstanding of how computers really work – particularly as implementations of a universal Turing machine. Some great papers on this have been written, including one at Groklaw, which explains in detail how computers really work, why all computer programs are simply mathematical functions – and even why all mathematical functions are really just numbers.

A great quote from it is this one: “Programming a computer is, essentially, just discovering a number that suits the programmer’s wishes”.

The thing is, for somebody whose only grounding in computation theory is that paper, this will seem like a bit of a leap. After all, the process of writing code is creative; it involves design and innovative thinking – surely this wonderful process cannot just be “discovering a number”? After all, you can do that just by counting – and this is WHY it’s unpatentable…

What I want to do with this post is – very simply – explain why that really is true. I’m going to give you a very simple computer program. I’ll write it in pseudocode so non-programmers can read it, but it can be implemented easily in any programming language and run – and in most of them will take less than about 10 lines of code:

Make the variable X equal to 0;
Start a loop here:
Write the binary representation of X into a new file.
Increase X by 1.
Continue the above loop until the program is interrupted by deliberately killing it (an infinite loop);
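The steps above can be sketched in real code too. Here is a minimal Python version; the function name and the optional `limit` parameter are my own additions (the pseudocode loops forever, but a bound makes the sketch demonstrable without killing it by hand):

```python
def enumerate_programs(limit=None):
    """Yield the binary representation of every non-negative integer, in order.

    Every compiled program is a finite string of bits, i.e. a number, so this
    sequence eventually contains the bytes of every program that can exist.
    With limit=None it counts forever, exactly like the pseudocode.
    """
    x = 0
    while limit is None or x < limit:
        # A number's byte string *is* a candidate program file; the pseudocode
        # would write these bytes out to a new file on each pass.
        n_bytes = max(1, (x.bit_length() + 7) // 8)
        yield x.to_bytes(n_bytes, "big")
        x += 1

if __name__ == "__main__":
    # First few "programs": the byte strings for 0, 1, 2, 3, 4.
    for blob in enumerate_programs(limit=5):
        print(blob)
```

Writing each yielded byte string to its own file reproduces the pseudocode exactly; I yield instead of writing so the sketch doesn't litter your disk.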

With this simple program I can create an exact copy of every single program ever written and – this is important – every single program that CAN ever be written.
This is because any compiled program is a file filled with zeros and ones – to a computer, that’s just a big number. (The computation theory and lambda calculus that explain how a number can BE an algorithm are needed to understand how this happens – but the important thing is: it’s a number.) This program will store every number that can exist into a file – by just counting.
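A couple of lines of Python (my own illustration, not part of the program above) show that any sequence of bytes really is just one integer:

```python
data = b"GET"  # stand-in for the bytes of a compiled program
number = int.from_bytes(data, "big")
print(number)  # 4670804 - the file IS this number

# the number alone is enough to reconstruct the exact bytes
assert number.to_bytes(3, "big") == data
```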

The process is very inefficient for a few reasons. Firstly, almost every program it produces won’t run; the vast majority of numbers do not correspond to useful programs – in fact only an incredibly small subset of them do – but they are still numbers you can count to, and they are still numbers my program WILL produce. Secondly, there is no real way to tell the useful programs from the ones that aren’t – you have to try to run all of them manually and see for yourself what happens. Moreover, for every program in there you’ll produce thousands of copies – some that will only run on computers other than yours. But somewhere in there will be a full version of Microsoft Outlook that can run on your computer… if you run it long enough, at least.
Another inefficiency is that it creates every program as one self-contained entity – as it ends up in memory – but programs aren’t sold like that. Programs have many parts that are identical between them (just like the number 105 and the number 316 both contain the digit 1 – just on a much bigger scale). It’s smart to store these in separate files so multiple programs can use them – it saves disk space – but it doesn’t change the number that actually goes into memory when the program is run; it merely stores that number more efficiently by avoiding replication.

The process is fully doable, however. It would take a massive amount of time to discover just the subset of numbers that correspond to a runnable program – let alone the ones among those that do anything useful – and of course, since you’ll also be generating every virus ever written, the process is likely to be rather harmful to your computer.

So indeed, you CAN discover every computer program ever written, and every computer program that WILL ever be written, just by counting – just like you can do any multiplication sum by adding up numbers repeatedly. But it’s a very crude and inefficient way of doing it; learning multiplication saves a lot of time and effort for the same result, yet even though it’s a faster process, ultimately it is STILL just adding up repeatedly.
Programming, in the end, is a technique whereby we can very efficiently narrow down onto the numbers that are truly useful. We use principles of engineering and mathematics to skip the addition and multiply directly, as it were – going straight to the number we are looking for. In fact it’s not a perfect process – that’s why all programs have bugs. We don’t get to exactly the perfect number for the program we want; we get to a number that’s so close as to make no difference. Somewhere in there is the number that would be the perfect, bug-free version of the program. The counting approach could find it (though it would be hard-pressed to prove it had), but the programming methods won’t – the closer you get, the harder it becomes to narrow down further. The very same things that make programming a much more efficient way than counting to find useful software also make its results slightly less perfect. It’s a bit like calculating the value of Pi: the longer you go on, the more accurate your answer becomes, but you never quite get there, and the cost of one more digit of accuracy must eventually exceed the value of having it.
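The multiplication analogy can be made concrete with a tiny sketch (mine, purely illustrative): multiplication really is just repeated addition, done far less efficiently:

```python
def multiply_by_counting(a, b):
    # the crude way: add `a` to itself `b` times
    total = 0
    for _ in range(b):
        total += a
    return total

# same answer as the "efficient" method, just many more steps
assert multiply_by_counting(7, 6) == 7 * 6
```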

So there you have it – not only is every program a number, they are all simple integers that can be counted to, and all the great skill and artifice of the programmer is really just a much more efficient way of finding the number we want – rather than counting through them all and checking each one (which, to be honest, could take centuries).

You cannot ask for a simpler piece of proof that a program is in fact a number – that software is discovered rather than invented. The fact that we have very effective methods of discovering them does not change this, and it does not change the fact that you should not be able to patent numbers.

I still highly recommend reading the article I linked – especially if you are a lawyer or activist involved in the software patent field – as it explains the underlying theories very clearly. Effectively it tells you how it came to BE that these numbers are useful, and how we derive their particular useful meaning from them. That process of derivation is what computer hardware does, and a better tool to do it with is patentable (which is why you should indeed be able to get patents on computer hardware) – but go in with this basic understanding: every computer program really is just a number, those numbers can be reached by simple counting – I’ve proven this to you here – and all of computer programming, as wonderful and delicious and artistic a field of endeavor as it is, is really, in the end, just a faster way to count to a number that we like.
This does not take away any of programming’s artifice; if anything it adds to the merit of the field, because the processes by which we count are complex and fantastic and beautiful, and we are always looking for ways to count even more effectively – we invent new programming languages and ideas like agile programming to help us do it better. But in the end, the result is just a number that anybody could have counted to – and that is NOT an invention you can patent.

UPDATE: Something I didn’t make clear above, but which is important, is that you will generate not only every program that can exist but every FILE that can exist. This includes, for example, if you read them all as .jpg files, every digital photograph – photoshopped or not – that can ever be taken. A digital camera is just a very efficient way to get to the number that represents a picture – it’s still art. Photoshop is a way to manipulate that same number with small algebraic changes to get to one very near it, but slightly different – it’s still art. This is why I say this reality doesn’t reduce programming’s artistic and creative status. If you read them all as text, every text file that can exist is in there too, from the Bible to Shakespeare’s Macbeth. But it also includes about a million numbers right NEXT to the one with Shakespeare’s Macbeth which differ only in that, say, an A on line 6000 has been replaced by a Z. Again, authors seem to be real artists for finding the “magic number” without counting and checking every possible variation – indeed for doing so long before we had the mathematical know-how to turn something like Macbeth into a number and back again. The PDF versions are in there too. Every music file, mp3-compressed and otherwise. Every news report and every YouTube video will get generated.
It will also be damn near impossible to find anything in there by looking manually – you’d have to study each number just to figure out whether you should run it, try to boot it, or open it in a video player! What’s worse, there’s real CPU-specific stuff in this approach: the 64-bit version of Outlook will be a very different number (literally, an order of magnitude different) from the 32-bit version of the same program.
The nice thing is that if you find the right source* file number, you can generate all the possible binary file numbers from it. You’d still need to wade through a few thousand near-misses, just as with Macbeth – versions that are almost but not quite right, except for one altered or missing character somewhere.
So programmers focus on finding the magic number for the file with the source code – because once you find that one, you can jump straight to any of the executables’ magic numbers with a single calculation, which we call “compiling”. See what I mean by “a much more efficient way to count to a useful number”?

*Text files, executables, source code, PDFs – all files, in fact – are saved as just one gigantic number on a computer. The computer just follows a set of rules to make sense of them. The exact rules differ between architectures: on an 8-bit computer, if you tell it that the file is “text”, it will read every 8 digits, take that as a number by itself, and find a corresponding letter in a chart (known as the ASCII set); 32-bit and higher computers read more at a time and can refer to longer and more complete charts like Unicode – but ultimately, what gets saved on the disk is still just one big number. Herein lies the secret to what lets the “universal Turing machine” actually work: software is data.
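The footnote’s rule (“read every 8 digits and look the result up in a chart”) can be demonstrated directly – this little Python sketch of mine turns a two-letter text into its number and back:

```python
text = "Hi"
number = 0
for ch in text:
    # take each character's position in the ASCII chart ('H' is 72, 'i' is 105)
    # and pack it into the next 8 bits of one growing number
    number = number * 256 + ord(ch)
print(number)  # 18537 - the whole text stored as a single integer

# reading it back: peel off 8 bits at a time and consult the chart again
assert chr(number // 256) + chr(number % 256) == "Hi"
```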

Feb 05 2010

It hit my mailbox today – the decision to put the entire letter in a JPG file probably got it past Gmail’s spam filters… but sheez, this is absolutely terrible… terribly perfect as a rip-off, that is…

Get this:
1) Firstly, the spelling and grammar are actually almost acceptably good!
2) It puts a whole new spin on the usual “God bless you for your help” and other religious crud in the “help me collect my dead husband’s fortunes” 419s by pretending to be from a Christian in Saudi Arabia who had converted (along with the dead hubby) from Islam.
3) It then goes on to state that her purpose for the money is to use it for charity! To build things like cancer research centers!
4) The “I have cancer” bit is a nice (if rather fucked up) twist…

Sheez… I can just see a million fundamentalists falling for this one… Here is the letter as I received it.
Please if you get this – IT IS NOT REAL. These scammers have in the past committed fraud, theft, kidnapping and even more violent crimes than that against people who respond. Do not fall for it.

Oh, and whichever scammer came up with this one… you know, “Sister Mary Jones” is really not a very believable name for a lady who was born to a Muslim family in Saudi Arabia!

419 Scam

May 05 2009

Now, as you may know, in my day job I’m a Unix system administrator. The users who use my servers are not secretaries; in fact, they are Unix programmers… you would imagine a rather high level of Unix competence among them… wouldn’t you?

Well… never ever assume competence among professionals, I guess… here are the top 5 funniest responses I have had to give to them:

  • Please explain to me why you need 5 terabytes of space to hold “Sample test data”?
  • If you save your work in /tmp, I’m afraid you cannot blame me if it gets deleted.
  • Why exactly would you check out a project from the source repository onto a network drive… and leave it there?
  • What’s worse: that was a plural “you”.

  • I’m sorry, I cannot install the software you requested, because it doesn’t exist.
  • I’m not making this up.

  • Next time you wish to kill my server, do me a favor: use a forkbomb. An endless loop that allocates 5GB of disk space to random data on every run, on a server sitting on a VM with a growable drive, is just too cruel…
  • No, I’m still not making this up.

Sigh… and the last one, today, while I’m sick and wishing I was in bed.