Nov 05 2010
 

So Fedora 14 coming out meant I wanted to try it. I've been running F13 on three machines so far: my work laptop, my media-player machine and my gaming desktop. On my work laptop the upgrade went smoothly and it runs beautifully; the reasons I first switched to it (resemblance to the RHEL systems running on the servers) still apply, and I have gotten pretty adept at Fedora's little quirks, so I'll keep it there – it works wonderfully in the office. The media machine is barely affected by the choice of distro because, once set up, the only software on it that matters much is XBMC, so I won't be installing any upgrades on it soon – it's not like it's ever going to be at risk of security breaches – all it does is play movies.

My gaming desktop however was another matter. From Fedora 13 to Fedora 14 there was a regression in usability on the kind of setup I have there that was so extreme I couldn't bear it. Upgrading failed miserably, leaving the system barely functional, so I did a clean install… and the problems didn't go away (I suppose not using the live media made it harder, but Fedora's design means that if you want to save bandwidth by reusing a download you already did, you can't use live media at all) – either way, the nouveau driver, while coming along nicely, is simply not good enough yet at the primary task (accelerating 3D) to use for gaming. Bugger. That's where things got hectic. It took hours of figuring out and googling to get the nvidia driver to work at all – and then it would only work on one screen at a time – so much for the lovely dual-screen setup I've used for nearly 3 years now!

Fedora's pulseaudio has been my biggest annoyance with it ever since F12, as I still think pulse is a solution looking for a problem, not finding it, and thus creating a whole bunch of new ones instead. Fedora 14 however proved to be a massive headache on every level. I don't much blame Fedora for the nvidia difficulties – that's nvidia's fault for not having a free driver, and the third-party packagers' for doing the worst job they ever did with it – but yum and packagekit reached new levels of broken integration, and the upgrader originally didn't bother to update my repositories (not even the official fedora ones) to know I'd changed releases… basically, I'm sorry, but F14 is the worst desktop release Fedora ever did and it is completely useless for my home desktop. It seems to work fine for the business-oriented usage of my laptop however; if that's all Fedora developers care about, then that's all I'll use their work for.

By 10pm last night I was simply too frustrated to keep fighting with it – I actually had other things I wanted to do on my computer this week and I wasn't getting any of it done. So I decided it was time for a new distribution – fast. I decided it was time to see how far kubuntu had come since I last saw it. Now my history with Canonical's distribution(s) has been shaky. Five years ago I got a copy of the first ubuntu release and it's safe to say I couldn't see what the hype was about. OpenLab was a far more advanced distribution at the time, both in terms of ease of installation and ease of use, and ubuntu's massive resources made this inexcusable – I was one man and I outdid them. Yes, I'll back that up. Just one example: ubuntu came on two CDs – one live disk and one install disk (which was text-only…). OpenLab came on a single CD, an installable live CD (in fact it was the very first distribution to ever do so; it had been possible to install earlier live disks like knoppix manually, but OpenLab had an easy graphical installation built into the very CD from version 4 – which came out the same time as the first Ubuntu).

Over the years I would sporadically try the Canonical systems again. Kubuntu, the KDE version, developed a reputation among KDE users and developers as the worst choice of distribution for KDE users – it had barely any resources compared to the many in Ubuntu, and was buggy and slow and badly configured, with horrible theming and broken defaults. Well, I tried it again last night – and credit where it's due. After 5 years, Canonical has finally impressed me. This is one solid distribution; kubuntu finally doesn't suck – and in fact it worked more smoothly than Fedora by a massive margin. I had everything set up to my liking in under an hour, including the custom things that I usually want to do. The old "thou shalt not touch" policy has been abandoned and instead the system made it easy to find out how to change what I needed to get what I wanted. I had my chosen display setup in seconds. The only glitch was with nvidia-settings not wanting to save the changes, but that was easy to fix (copy the previewed xorg.conf into a text editor, save it, and copy it into place). When the only bug I found is in software that Canonical couldn't fix even if they wanted to (though it's odd that I've never seen the glitch anywhere else before), it's not their fault.
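
For the record, a minimal sketch of that workaround – assuming the preview was saved from nvidia-settings as ~/xorg.conf.new (the filename is just an example):

sudo cp ~/xorg.conf.new /etc/X11/xorg.conf   # put the config where X expects it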

It gets better.

I can't find any sign of pulseaudio anywhere. Despite their initial bullying "you will like it because we tell you to" attitude about it (which led to at least one Ubuntu MOTU resigning), Canonical seems to have finally listened to the masses of users telling them that pulse is broken, doesn't add significant value and makes our lives harder. Pulse is gone! I am back to good old works-every-time ALSA sound and it's a thing of beauty! Chromium is in the default repositories – so no need to go download it manually like I had to on Fedora – and Amarok seems to work a lot better than it did on Fedora (read: it was so bad there that I ended up using rhythmbox in KDE rather than deal with it!).

Well done Canonical – you finally built a distro as good as the one-man projects out there – you actually seem to have finally let your squadron of ubergeeks listen to your users and your community, and you've built not only the best release I've ever seen from you but, in my opinion, one of the best distributions currently on the market. I still think it's a major issue that you don't meet FSF criteria, because you are at a point where everything works so well that I think most users could actually cope just fine if you did – you'd not be sacrificing any major functionality anymore. A few edge cases (like hardcore gamers) may want or need something that you wouldn't be able to support in repositories anymore – but then, those edge cases are almost by definition quite capable of figuring out how to add just the one bit they need. You've got an amazing distribution – it took you five years of lagging behind almost every other unsung desktop distribution (PCLinuxOS kicked your butts for years, Mint has outdone you every time, Kongoni was a better desktop distribution – and that was targeted at hardcore geeks of the gentoo-on-a-desktop variety) – but you've finally built a distribution that deserves the market-leading position you hold.

I admit it – Canonical did a damn good job on Kubuntu with 10.10 and I will, for the first time ever, be comfortable recommending it to newbies. Well done to the developers – and keep up the good work.

Nov 04 2010
 

In a way, I could say every book I've ever read has changed at least some of my views on some things. It would be a pretty piss-poor book that didn't make me question at least some of my ideas, even if the changes it brings about are minor.

But I will focus on one that changed my views fundamentally about something I thought they were already good on – and has been a guiding principle in my career and life ever since. In a break with tradition, I actually met the author of this book before I ever read any of his works. Back in 2001 I was a champion of the open-source idea. I spoke about the technical power that can be unleashed by sharing work and sharing eyeballs. I spoke about the security benefits – and even the fun of being able to customize. I avoided closed-source stuff, but I didn't actually think it was wrong – just… lesser.

Then I went to the first Idlelo conference to deliver a paper on a distributed educational content delivery network I had been developing. At the time it was groundbreaking stuff, which is why I got invited. The keynote speaker was Richard Matthew Stallman – a man I had long held in awe for his programming skill, for founding the GNU project, and for writing the GPL, but who I felt lacked the bit of pragmatism that "linux" would need to become mainstream.

Sitting through his talk though, the passion with which he spoke resonated with me. I found myself agreeing with him, and coming to the same conclusion he did: that free software isn't a nice-to-have, it's a right – perhaps, more practically speaking, an obligation of programmers to provide. I was sold. I also laughed out loud and caught every joke in the Saint IGNUcius comedy routine, and frankly I think those rare few young "geeks" who freaked out about it a year ago are simply proving that they are utterly out of touch with the culture that created the very movement and software they claim to be passionate about – a culture with a playful, anti-authoritarian nature, a nature more crucial to its very existence than all the programming skill in the world.

If you can't take and make a joke, you don't belong in free software development; we can get better code out of a worse programmer who has a sense of humor.

Of course a lot of people wanted his time, so my initial opportunities to engage with him one-on-one were limited. Then I learned that he has a deep love of mountains, and offered to take him on a drive around the mountains of the Western Cape winelands. We spent the trip having deep and intense discussions; mostly I was listening like a student at the feet of a master, but sometimes I disagreed and he could debate quite graciously (granted, none of the disagreements were about software freedom issues, about which I believe he is rather unyielding).

By the end of our trip and talk, he gave me a signed copy of a book containing his collected essays. I treasure that book. I reread it every now and then. I know every argument by heart and I have spent the past decade living by them. I am in occasional e-mail contact with him. While I was leading kongoni development, whenever we had to make a judgement call about a piece of software I would mail him for his input and take it as a major voice. It was by his encouragement that Kongoni included software to play DRM'd media – software that is illegal to distribute in the USA or that violates patents there. My country doesn't have software patents, and his advice was clear: do not give the bad guys more power than they have; you still have freedom from patents, you don't have a DMCA – let the tools people need to not be ruled by it be in your system.

A champion of open source became an unyielding advocate for free software, and I can say with pride that kongoni was a fully free distribution under my leadership – and recognized as such by the FSF – and that when I handed leadership over to another, I did it on condition of his promise that he would maintain that status (of course, he is not under any legal obligation to – all I have is his word, but he's kept it so far). I had a look at the most recent release the other day – it's quite sweet; he's really done good work using what I built and building on top of that.

Believe it or not, I'm more proud of what was built out of my creation than of the creation itself. That I could write the first versions of that code is a matter of pride; that somebody else could write something better because he could start where I left off is a matter of greater pride. Newton spoke of standing on the shoulders of giants. The giant whose shoulders I stand on is Richard Stallman, and the values I adopted from him have allowed me to be a giant on whose shoulders somebody else could stand.

It really is a great book, by a great man.

 

Sep 04 2010
 

Today I undertook the task of installing CyanogenMod 6.0 on my HTC Desire. The process can seem quite convoluted if you read the various docs, and what’s worse is they are all quite Windows-centric; in fact, as I went along I found it to be much simpler than the docs make it seem, and thought I would document how I did it.

Rooting your phone and loading a different Android OS on it automatically voids the warranty. If these steps cause your phone to explode and blind you then I take the same responsibility as the CyanogenMod developers do: which is to say – none whatsoever. You have been warned. Note that some parts of this were taken directly from the cyanogenmod wiki – small bits where I really couldn’t add anything useful.

The one mistake I did seem to make was to buy a new spare 2GB microSD card for the experiment. In retrospect, I did nothing with it I couldn’t have done with the 4GB card (though that is qualified by the fact that I don’t keep important data on the SD card).

The first step was to backup my contacts. I went into the “People” application, hit Menu and then exported to SD-card; I did this for both google contacts and sim contacts. I then connected the phone to my PC and copied these files to my hard drive so they would be easy to recover if the data got lost somehow.

Most howtos suggest you will have to have a microSD reader – I am not so sure of this; I used one, but you can probably get away without one. I will say that it helped to bypass the phone for some steps. Even so, the small microSD card reader I bought cost me all of R70, so it was hardly a major expense.

You will need to download the rootkit for the phone, as well as make a goldcard.

Let’s start with the goldcard. There are many docs out there that use the Android SDK and make you do difficult steps that aren’t distro or OS neutral… basically it’s a schlep – here’s the easy way. With the SD-card you want to use in the phone, go into the android market and install GoldCard Helper. This app will, when you run it, produce the code you need, then let you copy it to the clipboard, open the website, paste it and have your goldcard.img mailed to you in a very simple set of steps. Why do it the hard way when there’s a nice automated tool to do it for you and protect you from typing errors?

When you have the goldcard image, stick the SDcard in your card-reader, and copy the image file onto it with dd:
dd bs=512 if=goldcard.img of=/dev/sdd
Note that your disk device name may not be sdd – replace it with the right device name; right after you plug the reader in, dmesg should show it to you.
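
A quick illustrative check around the dd (device names here are examples only):

dmesg | tail    # the last lines should name the newly attached disk, e.g. [sdd]
sync            # after dd completes, flush buffers before pulling the card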

Once done, mount the sdcard and copy the update.zip from the rootkit I linked above onto it (note that this is the rootkit for bootloader 0.80 – if you aren’t sure what this means, first read up on that, as earlier versions need a different rootkit).
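
Something along these lines works – the mount point and extraction path are illustrative:

mount /dev/sdd1 /mnt/sdcard
cp ~/rootkit/update.zip /mnt/sdcard/
umount /mnt/sdcard    # ensures the copy is flushed before you eject the card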

Now power down your phone, then boot it again while holding down the back button, making sure it’s connected via USB. You’ll get to the fastboot menu.

In the directory where you extracted the rootkit run this on your PC: ./step1-linux.sh

Go to Bootloader|Recovery

You’ll get to a black screen with a red triangle on it. Hold down the volume button and tap the power button. This brings you to the recovery screen. First run through “Wipe data”, then when this completes run through “Apply sdcard:update.zip”; make sure you are connected to the PC the whole time.
The process takes a while, but once it completes, pull out the battery and boot back up – your system is rooted. The Desire will run through its initial setup screen at this stage (all your settings having been wiped – I did warn you).

That’s the first phase: your phone is now rooted.

Now go to the market and install Rom Manager. Once in Rom Manager, install ClockworkMod first, which Rom Manager uses to select boot images.

This takes a while.

Use the Backup option to back up your current rom.

This will also boot you into the ClockWorkMod recovery system (Rom Manager lets you autoboot in here anytime). And this is where I got stumped – and had to google for an answer. Unlike the HTC’s own recovery menu, clockworkmod does NOT use the power button for select; you still move around menus with volume, but you select with the trackball.

Download the latest version of the radio (5.09.05.30_2).
Place the radio update.zip file on the root of your SD card.
Boot into the ClockworkMod Recovery.
Once you enter ClockworkMod Recovery, use the side volume buttons to move around, and the trackball button to select.
Select Install zip from sdcard.
Select Choose zip from sdcard.
Select the radio update.zip.
Once the installation has finished, select Reboot system now. The HTC Desire’s Baseband version should now be 5.09.05.30_2.

When you boot, you will first see a weird icon, and the HTC will appear to hang for several minutes. Don’t panic – it boots up eventually.

Boot the phone and run rom manager again.

Go to Download Rom. CyanogenMod should be right at the top of the screen. Tap “Stable Release” and wait for the download to complete. Rom Manager has an option (which will pop up now) to automatically add the google apps to cyanogenmod (which the primary distribution of it cannot include for licensing reasons) – add them if you want them.

Once the ROM is finished downloading, it asks if you would like to Backup Existing ROM and Wipe Data and Cache.

If Superuser prompts for root permissions, check Remember and then Allow.

The phone will now reboot into recovery, wipe data and cache, and then install CyanogenMod. When it’s finished installing it will reboot into CyanogenMod.

Feb 16 2010
 

Most of you probably know this by now, but just over two weeks ago – I announced the end of my involvement in the kongoni project. The reasons were stated in the original post so I won’t be rehashing them here. I did however state that if somebody volunteers to take over the leadership – I will gladly pass it on, and help the person to get going.

The good news is, less than 48 hours later, such a volunteer emerged. We’ve been working together quite hard over the past few days as I taught him the structures and set up access for him to the various pieces of infrastructure that make up the build systems for Kongoni. By mid-week he had done his first ISO build and by yesterday he was starting to get ready to do git commits and publish his first changes to git-current.

He’ll have a steep learning curve still to get to know the system’s many ins-and-outs like I do, but he’s at a working level and progress can once more begin. I am very happy to be able to tell you all this, as it means that my greatest regret about leaving kongoni – that it left the users without an upgrade path – has been resolved.

So I am happy to announce that Robert Gabriel is the new leader of the Kongoni project; he has already launched a rather spiffy new Kongoni website which I urge you to check out.

For myself, this by no means ends my involvement with free software, least of all with the fully-free-distribution movement; it merely shifts my direction to something more feasible for me as a person with my particular practical considerations at this time. I have, in the time since the announcement, accepted an invitation to become a contributor to the gnewsense project. I am slowly learning the ins-and-outs of the gnewsense ideas, and my initial progress has been slowed by dedicating time to helping Robert get started – but I have, as an initial step, taken responsibility for adding and maintaining a chromium package for Gnewsense. In the future I intend to get quite heavily involved – and possibly take over most of the maintenance on gnewsense-KDE, as currently there is very limited work done there (largely due to lack of manpower).

So, here’s to the future. The kng is dead, long live the kng.

Dec 09 2009
 

I won’t get into the concerns about whether google chrome is proper free software right now, mostly because I’ve started a discussion on it with the gnu/linux-libre group, which is a coalition of free distro developers where we collaborate and discuss these things together – and I don’t want to push anything until that conversation is done. Instead, here’s my review of the google-chrome browser’s official GNU/Linux beta release as I found it in my testing – with a mostly technical focus.

This also means there won’t be a kongoni port just yet – whether there will be one depends on the outcome of the aforementioned discussions.

I received a mail from google last night (which I’d requested) informing me that the Linux beta of google-chrome is now officially available; I followed the link and was greeted by a nice XKCD-esque comic about it. Followed another link and got the download page with the ugly EULA. Oh well. Packages were available for a few major distros – four in all: 32/64-bit RPMs and 32/64-bit debs.

Kongoni can convert either to a usable format, and I’d previously done some chromium testing using the debs, but I opted for the RPMs here. I doubt this makes much difference, but just for interest. I grabbed the 64-bit one, ran rpm2tgz on it, and installed it. It created an /etc/cron.daily script which is meant to install the regular updates; currently, of course, this won’t work, but if I end up supporting it, the kongoni version can easily enough replace it with its own. The RPM version’s script seems to use yum; presumably the deb version will use apt.
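
Kongoni being Slackware-based, the conversion is roughly this – the rpm filename is an assumption, use whatever the download page gives you:

rpm2tgz google-chrome-beta_current_x86_64.rpm    # produces a Slackware-style .tgz
installpkg google-chrome-beta*.tgz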

That out of the way, the next clincher was that it missed some libraries from mozilla-nss, which was odd since I have it installed. Double-checking, the library names weren’t the same – close, but it wanted additional 1d and 0d extensions; a couple of symlinks sorted that out.
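
The fix looked something like this – the exact sonames chrome asks for may differ, these are illustrative:

cd /usr/lib64
ln -s libnss3.so libnss3.so.1d          # Debian-style suffixes chrome was built against
ln -s libnssutil3.so libnssutil3.so.1d
ln -s libnspr4.so libnspr4.so.0d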

It came up and seemed to work – but wouldn’t render anything. I checked the console output – lots of shm messages. Okay, I know that one from earlier experiments: set /dev/shm to world-writable and retry. It imported my firefox settings, including saved passwords and bookmarks, wonderfully – and suddenly, it works sweetly.
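
For reference, the usual form of that shm fix (run as root):

chmod 1777 /dev/shm    # world-writable with the sticky bit, like /tmp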

Okay, start playing… it’s fast, very fast. Faster than I remember from testing on windows… much faster. It’s slick, easy to use and just flows around the net. Played a bit with extensions – installing one for facebook and one for twitter – both worked instantly, without requiring a restart, and ran very nicely (though the buttons-next-to-the-address-bar choice may not be their best decision – that could get very cluttered fast for people with lots of extensions). Some googling around failed to find an ad-blocking extension just yet, or anything for laconi.ca, but I may just not have looked hard enough.

Still, I rather like it: it’s bleeding fast, beautifully rendered… just about perfect in fact. The slight difficulties installing are probably because I didn’t build a proper kongoni port, and thus had to do manual work to sort things out when converting a package built for another distro. All in all, I think google and their volunteer developers on chromium did an awesome job with the port. Well done – when my only gripe is a minor one of aesthetics, that says something.

Sep 23 2009
 

This is a post based on personal experience that led to a fairly major outage for me recently, so I won’t share any specific details, but I will explain the issue so others can be warned. The automountd in question was running on an older version of hpux, so I suppose it’s possible that newer Linux systems have some kind of protection against this, but since the flaw is fundamentally part of how automount works – I doubt it.

Imagine you have an nfs share that contains a lot of directories, and various clients will only access some of them. One popular setup here is to set the master share up as an automount, hooked into the subdirectories. Let’s say you set this up on /shared_files.
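
In autofs terms that usually looks something like this – server name and export path are illustrative:

# /etc/auto.master
/shared_files   /etc/auto.shared

# /etc/auto.shared – wildcard map: each subdirectory becomes its own mount
*   nfsserver:/export/shared_files/&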

Now when a user tries to enter /shared_files/documents, for example, the automounter will send a mount request to the NFS server, mount the documents directory directly, and the user transparently gains access… sounds perfect, right?

Here’s the flaw… what happens if the user tries to access a directory which doesn’t exist in the share? Say /shared_files/garbage… well, a mount request gets sent, the server rejects it – and the user gets a file-not-found.

That’s all well and good, right?

But now… what if I do this:

while /bin/true ; do
    # every lookup of a random, nonexistent name fires off a fresh mount request
    ls /shared_files/$RANDOM
done

See what happens now: an instant denial-of-service attack on the NFS server. Normally, NFS is fairly safe from DoS attacks, as it’s usually not exposed online and generally someone inside the company would need root access to issue a mount request – but this can be done by any user, and worse, on any server he has access to (so it could be distributed); and just to add the cherry on top, similar scenarios could just as easily spring from stupidity or a buggy program/script – there isn’t even any need for malice…

This problem isn’t limited to NFS; you’d have the exact same issue if you were using CODA or practically any other network file system. Essentially automount, when used at an “in the directory” level, is a disaster waiting to happen: it’s a daemon that executes a root-privileged command when triggered by actions a non-privileged user can perform… inherently this is very dangerous.

It is for this reason that I am piece-by-piece ridding my network of automount-based setups, and switching to just mounting the /shared_files equivalents using fstab directly (besides which, one on-boot mount request is so much less overhead than hundreds of on-access requests).
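
The fstab replacement is a single line per share – server name and options again illustrative:

nfsserver:/export/shared_files   /shared_files   nfs   hard,intr   0   0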

Sep 07 2009
 

Hello, and welcome to this, the final part of my tutorial series on doing photo editing with GNU/Linux and free software. I say final because after this we’re getting into the professional arena, for which good books are a much better option than blog posts. What I will be covering here is some basic photographic handling skills and how to do them in the gimp. I won’t be using channels or layers in this tutorial, but everything in here remains applicable when you do.

A common problem for a photographer shooting outside a studio is that your light and color and backgrounds are not ideal, so we’ll learn how to do some basic corrections, and how to highlight our subject against a cluttered background. Here is a picture I took at the 7thson gig; it was near the end of the show and to get it I had to use a very high ISO level, as the light was fading and I wasn’t using a flash. The price I paid was digital noise – and the background is rather cluttered.
Here you see the image, in UFRaw, with our first bit of highlighting already under way: we’re cropping out some of the wasted space.

I’ve not done much else in RAW on this one as the shot was really rather good, I only raised the exposure level very slightly.

Clicking on OK gives us the image in GIMP:

Our first problem is the digital noise that came from using a high ISO level. Gimp has a built-in tool to help correct this, called Despeckle (Filters|Enhance|Despeckle). UFRaw also has such a tool, but it offers far less control; despeckling costs you detail, so you want to aim for an optimum balance to lose as little as possible and maintain a usable picture (if perhaps not a print-quality one).

This gives us a much better picture already. Now we can reduce some of the effects of despeckling by going for a softer-focused image. For this we use one of GIMP’s most useful tools, the Gaussian Blur, with a smallish radius; it just softens the focus enough to smooth out the picture without losing so much detail as to harm it – in fact, it makes it prettier.
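
As an aside, if you ever need to apply the same despeckle-then-soften sequence to a whole batch of shots, ImageMagick can approximate it from the shell (a different tool from the GIMP workflow described here, and the blur sigma is just a starting point):

convert input.jpg -despeckle -gaussian-blur 0x1.5 output.jpg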

Now we want to deal with the background. Ripping it out is possible but would cost us all context; instead we just want to remove its eye-pulling effect while maintaining its presence, so it provides atmosphere rather than interference. I started by removing the drummer’s half-visible head (as it contributed nothing) using the clone tool (I cover this in detail below). Now for the fun part though – selecting the background outside the guitarist. Gimp offers an easy way to select complex shapes using the Path tool: by clicking along the contours you can gradually select out your target.

Now we use Select|From Path to turn our path into a selection. You can also use the lasso tool for this, but only if you have a very steady hand. We will begin by reducing the coloring of the background, making it effectively a semi-black-and-white image, thus reducing its distracting effect a great deal and making it nice for atmosphere. First we need to adjust our selection a little: we start by shrinking it until it is just inside the border of the guitarist (Select|Shrink) – so we’ll have a smooth transition – then, to smooth things further, we feather it (Select|Feather) by about 5 pixels.

We invert the selection to select the background around the guitarist rather than the guitarist himself, open the Hue/Saturation tool from the colors menu, and drop the saturation a good deal – this gradually reduces the coloring of the background, until we get a near black-and-white effect with just enough color to still look nice.

Below, you can see the results – already an improvement, but there’s a catch: look at the guitar cable. It’s now much brighter on the player’s leg, and thus causes a problem. Luckily, the way it runs, if we remove it, it will just appear that it ran behind him – so we have an easy answer: just get rid of it.

For this we use gimp’s clone tool. Cloning lets us take one part of the image and paint over another with it. It’s a powerful tool, but at its most basic, using a feather-edged brush of about 35-pixel diameter and the aligned mode, it lets us quickly paint out the cable using the surrounding denim.

Now all we do is another small Gaussian blur to remove the hard edges from the changes we made, and here we have our final result: it’s pretty, it’s got atmosphere, and we’ve turned a bad-luck shot into a work of art.

Aug 28 2009
 

As I left after my vacation, fully intent on pursuing photography all out, my dad gave me some copies of programs he uses day to day (as one who works primarily on Windows). One of these is called “RescuePro” – a program Sandisk makes freely available (as in beer) with their SD cards (though many shops neglect to include the CDs).

The purpose of this program was, he said, to be able to recover files deleted from your SD card either by accident or because of a software bug. I hadn’t had the opportunity to look into this issue, and since my new camera is not yet acquired and I was just working with pix already on my hard drive, I haven’t had a need yet. As it turns out, fortune smiled on me and in my RSS feeds this morning was a blogpost shared by Karl Fischer covering a GNU/Linux tool for the same purpose.

It seems this is a frequent need for photographers, so it’s as well to be prepared, and good to know that as a GNU/Linux-using photographer you don’t have to miss out on anything. I know I promised a post on touch-ups, but that will take some time still; in the meantime, this was a piece of evidently appropriate information that I felt deserved a post in this series.

So I would highly recommend ensuring that you have PhotoRec installed on your machine as part of setting up your digital darkroom. The post in question, which also runs through its use very nicely and has all the required information, is here.