-
Terminator 0.2 released
Two days in a row! This is not going to be a continuous thing, but since it’s the weekend I have been hacking on Terminator a lot. This is a big release for me: it finally brings in one of the crucial features required to make this more than just a script for 4 terminals in a window - you can now split terminals on demand. Right-click on one and you can turn it into two terminals, horizontally or vertically. My roadmap currently is to have 0.3 allow you to remove terminals. 0.4 will then concentrate on loading/saving some kind of profile so you don’t have to do a complex splitting procedure each time you start Terminator. I’m not sure how many other features will get in between 0.4 and 1.0, because there is lots to do on the gconf and gnome-terminal emulation. Head on over to Terminator’s page for various links, including the download link.

-
Terminator 0.1 released
I’ve just pushed out the first release ever of Terminator, a python script to make a window have multiple Terminals in it. It’s still very rough around the edges. And the middle. But it’s there! Rather than repeat myself here, just click over to the Terminator page for full details.
-
Firefox bad for Linux?
Firefox is a very popular piece of software. Claims run up to 100 million users, which is really good, and on the whole I think it’s a very good browser. However. What Firefox isn’t, is integrated. Sure, it renders using GTK (and Cairo, if not already then soon) and GNOME actions involving URLs spawn Firefox, but it’s still trapped away in its own little universe - Marc Andreessen’s gift to the world, a platform-agnostic application architecture. Clearly Mozilla has built itself a highly capable cross-platform application architecture, but that necessarily isolates them on every platform. The trigger behind this post is the patches that recently appeared to let Epiphany use WebKit (Apple’s fork of KHTML, as used in Safari). Epiphany isn’t a bad browser, but it’s not flexible like the fox (purely because there aren’t enough extensions). The problem here is that if GNOME is going to achieve the online desktop integration they have been talking about, reliable HTML widgets seem quite vital. GtkMozEmbed (I say, having never used it) appears to be very painful to work with. A high-quality GNOME widget based on WebKit that makes displaying HTML really easy would be extraordinarily useful to the project. It would allow the browser to disappear into the desktop - want to visit a page? Click/press something, type some stuff which is an address or search keywords, and out slides the appropriate web page. It gets rid of the necessity to go Applications->Internet->Firefox before typing a URL (and yes, I know things like Deskbar can launch a browser in these circumstances). Mostly, it would massively lower the barrier to writing apps which partly rely on the internet, or HTML in general, which can only be a good thing for a more online world. What’s holding it back though is Firefox. It’s a very popular piece of software, even on Windows.
Maybe too popular: if Ubuntu were to drop Firefox by default in favour of an integrated future version of Epiphany, it could hurt Ubuntu - it would lose one of its selling points, namely that it ships the much-vaunted Firefox thingy people have heard of. (I also wonder if GTK should support CSS ;)
-
hacky root partition resizing
How would you shrink the root file system of a remote machine? Of course the easy answer is to boot into a rescue environment and do it (because you can’t shrink ext3 online). If you have a good KVM or iLO setup, you already have a rescue environment of sorts - the initramfs. Chuck “break=mount” on your kernel command line and the initramfs will drop you to a shell before it mounts the root filesystem. You can now mount the root fs manually and copy out the required tools/libs (e2fsck, resize2fs, fdisk and their libraries, in this case), then unmount the root fs. Now, with appropriate $LD_LIBRARY_PATH mangling, you can run the extracted binaries and operate on your root partition with impunity.
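The copy-out step is the fiddly bit, because each tool drags its shared libraries along with it. Here’s a minimal sketch of that step (the destination directory and binary paths are just placeholders - distros scatter e2fsck and friends around):

```shell
# Sketch: gather a binary plus the shared libraries it links against
# into one directory, so it can still run after the real root fs is
# unmounted. /tmp/rescue and the example paths are assumptions.
copy_with_libs() {
    bin="$1"; dest="$2"
    mkdir -p "$dest"
    cp "$bin" "$dest/"
    # ldd prints lines like "libc.so.6 => /lib/libc.so.6 (0x...)";
    # pull out the absolute paths after the "=>"
    for lib in $(ldd "$bin" | awk '$2 == "=>" && $3 ~ /^\// { print $3 }'); do
        cp "$lib" "$dest/"
    done
}

# Demonstrated here on /bin/sh; in the real procedure you would run it
# for e2fsck, resize2fs and fdisk while the root fs is still mounted,
# then unmount it and run something like:
#   LD_LIBRARY_PATH=/tmp/rescue /tmp/rescue/resize2fs /dev/sda1 20G
copy_with_libs /bin/sh /tmp/rescue
```

The `break=mount` hook does the rest - once the shell appears, everything in your rescue directory is runnable against the (unmounted) root device.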
-
Computing nostalgia
It’s pretty much exactly a decade since I started using Linux, so it seems like a good time to look back at what I used to use before. Immediately prior to jumping into the FOSS world, I was using Windows 98, but I don’t really want to talk about that because I never really liked it and it hated my hardware, so it was a very brief partnership. The 7 or 8 years before that, though, were computing heaven, because I was a devoted Amiga user. Initially I was using an A500, which I added a second floppy drive to (I think the Cumana drive I bought cost me about £80!), as well as a couple of MB of Fast RAM (some of which I hacked into being Chip RAM for better graphics). Eventually the 500 was getting far too restrictive and even my 2-disk boot environment was getting hard to live with, so I got a job in a supermarket to earn some money to buy a shiny new A1200, which was a pretty big leap forward over the 500. After a while I put the much faster 68030 CPU in it (thanks to phase5’s excellent 1230 IV expansion card), a 16MB SIMM and a 120MB 2.5” hard disk. Later I swapped the 030 card for an 040 card, for even more blazing performance. Anyway, enough boring hardware reminiscing, on to the fun stuff! For a while now I’ve wanted to rescue everything on the last Amiga hard disk I owned (a Western Digital 1.2GB monster!), but since my A1200 had something of a small accident (here’s a tip, kids: never use the inside of a computer as a footrest) that wasn’t going to be hugely easy. Had I not broken the 1200, things would have been fine - by the time I stopped using the Amiga it had an Ethernet interface and a fair whack of UNIX programs on it, like scp. A few months back I fished the disk out of the remains of the Amiga (now forever consigned to the past, as I took the carcass to the local dump), hooked it up to an external USB-IDE interface and took a raw image of the disk.
I then bought Amiga Forever, a distribution of various Amiga emulators and a pretty much complete set of officially licenced ROMs and system disks (lacking working hardware, there was no way I could get dumps of my ROMs or transfer the contents of the PC-incompatible floppy system disks). I briefly dallied with the included emulator for UNIX (the venerable UAE), but it was pretty unstable, and on further investigation it turns out that most of the development work these days goes into the Windows fork (WinUAE). This was quite disappointing and I never really looked into it any further. That was, until last night, when I started tidying up all the crap on my desktop and got to the Amiga Forever folder. The pangs of nostalgia grabbed me again and I decided to have another stab at things. This time I used E-UAE, another fork of UAE, maintained by Richard Drummond (any Amiga user will recognise that name). He has been diligently pulling in the improvements from WinUAE, and it really shows. It’s much more stable than vanilla UAE (although I can still provoke it into crashing). This was a good start, but I was still left with the problem of how to extract the data from the disk image I had. After battling with the UAE configs a little, I discovered that there was something wrong - I could only persuade the Amiga to see 1 of the 4 partitions. Fortunately it was the one with all my data on it - except my old programming stuff - but then the point of this exercise was not really to rescue data, as I had copied off the stuff I really cared about before I stopped using the machine. The point was to get *my* Amiga running again, even if the hardware was now just some software. I chatted with some of the long-time Amiga stalwarts I still talk to on IRC, and one of them pointed me at some really simple code to extract partitions from an Amiga disk image. This proved to be part of the key to making everything Just Work™. The other part being that Linux can read AFFS formatted partitions.
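I don’t have the code I was pointed at to hand, but the idea is simple enough to sketch: an Amiga disk starts with a Rigid Disk Block, and its PART blocks tell you exactly where each partition lives in the raw image. The sketch below follows the documented RDSK/PART field layouts - treat it as an illustration of the technique, not the actual code I used.

```python
# Hedged sketch: walk the Rigid Disk Block (RDB) of a raw Amiga disk
# image and report where each partition starts and how long it is, so
# it can be carved out with dd or mounted directly via a loop offset.
import struct

BLOCK = 512  # RDB structures are addressed in 512-byte blocks

def read_rdb_partitions(image_bytes):
    """Return a list of (byte_offset, byte_length) partition extents."""
    # The RDSK block lives somewhere in the first 16 blocks of the disk
    rdsk = None
    for blk in range(16):
        if image_bytes[blk * BLOCK:blk * BLOCK + 4] == b"RDSK":
            rdsk = blk * BLOCK
            break
    if rdsk is None:
        raise ValueError("no RDSK block found")

    # All RDB fields are big-endian 32-bit words
    block_bytes = struct.unpack_from(">I", image_bytes, rdsk + 16)[0]
    part_blk = struct.unpack_from(">I", image_bytes, rdsk + 28)[0]

    parts = []
    # PART blocks form a linked list, terminated by 0xFFFFFFFF
    while part_blk != 0xFFFFFFFF:
        base = part_blk * BLOCK
        if image_bytes[base:base + 4] != b"PART":
            raise ValueError("bad PART block at block %d" % part_blk)
        next_blk = struct.unpack_from(">I", image_bytes, base + 16)[0]
        # The DosEnvVec sits at offset 128: index 3 = surfaces,
        # 5 = blocks per track, 9 = lowCyl, 10 = highCyl
        env = struct.unpack_from(">17I", image_bytes, base + 128)
        surfaces, blk_per_trk, low, high = env[3], env[5], env[9], env[10]
        cyl_bytes = surfaces * blk_per_trk * block_bytes
        parts.append((low * cyl_bytes, (high - low + 1) * cyl_bytes))
        part_blk = next_blk
    return parts
```

With the byte offsets in hand, Linux’s AFFS support does the rest - something along the lines of `mount -t affs -o loop,offset=OFFSET disk.img /mnt/amiga` (the exact offset obviously depends on your disk).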
I quickly mounted them and pointed E-UAE at the mountpoints and bam! off it went. OK, so I had to spend a few minutes hacking the various hardware hacks I had out of the Startup-sequence, but with that done, I was left with a pretty much exact copy of what I used to use 10 years ago. It’s a very strange experience, leaping back in time like this. You look over your old code, email, pictures and so on, and while one part of you thinks “hey, I remember this!”, another part thinks “damn, what was I thinking” ;) As Jamie Zawinski found when he tried to do a similar (but unfortunately for him, much more painful) operation a while back, the best way to keep data from being obsoleted is to keep it on a live computer. Sooner or later all hardware fails, but if you always transfer all of your data from one computer to your new one, you’ll never have a huge gap to cross (this is exactly how speciation works, by the way). Emulation and FOSS suggest that there is no real reason why my Amiga can’t now live on forever, virtually. That’s hardly the hugest achievement of mankind, but it makes me happy. I’d like to say thank you to everyone who made the Amiga, everyone who made its community such a fantastic place, and everyone who still works on making it live on. (As a side note, this all serves to make me think what a natural predecessor to the current Linux ecosystem the Amiga was. It had a powerful shell, a friendly GUI, but most crucially, an active and dedicated community.)