Category Archives: ubuntu

Ubuntu Lucid Lynx great as ever, no game changer

I’ve upgraded my laptop to Ubuntu Lucid Lynx, and I’m using it to type this post. Ubuntu Lucid Lynx is a “long term support” edition, making it suitable for businesses. The upgrade from Karmic, the previous version, went relatively smoothly. I say relatively because my laptop is dual boot and has two hard drives. For some reason Grub, the Ubuntu boot loader, always detects the partitioning incorrectly, so when I first start up after an upgrade it cannot find the drive. I have to hit “e” for edit, correct the reference to the boot partition, and then fix Grub’s menu.lst once I am back in.
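For anyone facing the same thing, the edit is small; this is only an illustration of the sort of change involved, since the device names depend entirely on your own partitioning:

# excerpt from /boot/grub/menu.lst – device names here are examples only
title  Ubuntu, kernel 2.6.32-generic
root   (hd0,0)                  # Grub has guessed the wrong drive
kernel /boot/vmlinuz-2.6.32-generic root=/dev/sda1 ro quiet splash

# if the Ubuntu boot partition is actually on the second drive, point both lines there
root   (hd1,0)
kernel /boot/vmlinuz-2.6.32-generic root=/dev/sdb1 ro quiet splash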

That aside, all went well, Compiz didn’t break and I still have wobbly windows – a fun graphic effect that I have only seen on Linux.

I would recommend Ubuntu to anyone, provided that they can cope with occasional forays into menu.lst and the like. I cannot think of everyday tasks which are not easily accomplished on Ubuntu. Performance is excellent, and it feels a little faster than Windows 7 on this oldish Toshiba laptop. Considering the cost, it is a fantastic bargain for both home and business users. No Windows tax, no Apple tax, no Microsoft Office tax.

There are a couple of other issues, though, that continue to hold it back. One is what I can best describe as a lack of polish. Part of the reason is that less money is spent on design; Linux looks less home-made than it once did, but put Ubuntu’s new Music Store (an extension to Rhythmbox) alongside Apple’s iTunes and the difference is obvious. Personally I prefer Rhythmbox, but for looks there is no contest.

Another problem is application availability. Many major Windows applications such as Microsoft Office can be made to work on Ubuntu via the Wine non-emulator, but it is not ideal. It’s certainly a problem for the work I do. I’m about to spend some time with Adobe’s Creative Suite, for example, which I could not do in Ubuntu.

One thing that will help drive Ubuntu and Linux adoption on the desktop is cloud computing. I have a separate blog post coming on this; but Microsoft’s new Office Web Apps could help considerably in mixed Linux/Windows networks. Specifically, I noticed that a Word Open XML document (.docx) which lost its formatting in Open Office, the suite supplied with Ubuntu, worked fine in Word Web App accessed with Firefox. Cloud and web-based computing goes a long way towards solving the application problem.

I like Ubuntu very much, but I don’t expect it to dent Windows or Mac sales any time soon.

QCon London 2010 report: fix your code, adopt simplicity, cool .NET things

I’m just back from QCon London, a software development conference with an agile flavour that I enjoy because it is not vendor-specific. Conferences like this are energising; they make you re-examine what you are doing and may kick you into a better place. Here’s what I noticed this year.

Robert C Martin from Object Mentor gave the opening keynote, on software craftsmanship. His point is that code should not just work; it should be good. He is delightfully opinionated. Certification, he says, provides value only to certification bodies. If you want to know whether someone has the skills you want, talk to them.

Martin also came up with a bunch of tips for how to write good code, things like not having more than two arguments to a function and never a boolean. I’ve written these up elsewhere.


Next I looked into the non-relational database track and heard Geir Magnusson explain why he needed Project Voldemort, a distributed key-value storage system, to get his ecommerce site to scale. Non-relational or NoSQL is a big theme these days; database managers like CouchDB and MongoDB are getting a lot of attention. I would like to have spent more time on this track, but there was too much else on – a perennial problem at QCon.

I therefore headed for the functional programming track, where Don Syme from Microsoft Research gave an inspiring talk on F#, Microsoft’s new functional language. He has a series of hilarious slides showing F# code alongside its equivalent in C#. Here is an example:

[Slide comparing F# code with the equivalent C#]

The white panel is the F# code; the rest of the slide is C#.

Seeing a slide like this makes you wonder why we use C# at all, though of course Syme has chosen tasks such as asynchronous IO and concurrent programming for which F# is well suited. Syme also observed that F# is ideal for working with immutable data, which is common in internet programming. I grabbed a copy of Programming F# for further reading.

Over on the Architecture track, Andres Kütt spoke on Five Years as a Skype Architect. His main theme: most of a software architect’s job is communication, not poring over diagrams and devising code structures. This is a consistent theme at QCon and in the Agile movement; get the communication right and all else follows. I was also interested in the technical side though. Skype started with SOAP but switched to a REST model for web services. Kütt also told us about the languages Skype uses: PHP for the web site, C or C++ for heavy lifting and peer-to-peer networking; Delphi for the Windows interface; PostgreSQL for the database.

Day two of QCon was even better. I’ve written up Martin Fowler’s talk on the ethics of software development in a separate post. Following that, I heard Canonical’s Simon Wardley speak about cloud computing. Canonical is making a big push for Ubuntu’s cloud package, available both for private use and hosted on Amazon’s servers; and attendees at the QCon CloudCamp later on were given a lavish, pointless cardboard box with promotional details. To be fair, Wardley did not talk much about Ubuntu’s cloud solution, though he did make the point that open source makes transitions between providers much cheaper.

Wardley’s most striking point, repeated perhaps too many times, is that we have no choice about whether to adopt cloud computing, since we will be too much disadvantaged if we reject it. He says it is now more a management issue than a technical one.

Dan North from ThoughtWorks gave a funny and excellent session on simplicity in architecture. He used pseudo-biblical language to describe the progress of software architecture for distributed systems, finishing with

On the seventh day God created REST

Very good; but his serious point is that the shortest, simplest route to solving a problem is often the best one, and that we constantly make the mistake of using over-generalised solutions which add a counter-productive burden of complexity.

North talked about techniques for lateral thinking – ways of finding solutions from which we are mentally blocked. One is chunking up, merging details into bigger ideas until you arrive at “what is this thing for anyway?”; another is chunking down, the reverse process, which breaks a problem into blocks small enough to comprehend. Another idea is to articulate a problem to a colleague, which exercises different parts of the brain and often stimulates a solution – one of the reasons pair programming can be effective.

A common mistake, he said, is to keep using the same old products, systems or architectures because we always do, or because the organisation is already heavily invested in them, meaning that better alternatives do not get considered. He also talked about simple tools: a whiteboard rather than a CASE tool, for example.

Much of North’s talk was a variant of YAGNI – you ain’t gonna need it – an agile principle of not implementing something until/unless you actually need it.

I’d like to put this together with something from later in the day, a talk on cool things in the .NET platform. One of these was Guerrilla SOA, though it is not really specific to .NET. To get the idea, read this blog post by Jim Webber, another from the ThoughtWorks team (yes, there are a lot of them at QCon). Here are a couple of quotes:

Prior to our first project starting, that client had already undertaken some analysis of their future architecture (which needs scalability of 1 billion transactions per month) using a blue-chip consultancy. The conclusion from that consultancy was to deploy a bus to patch together the existing systems, and everything else would then come together. The upfront cost of the middleware was around £10 million. Not big money in the grand scheme of things, but this £10 million didn’t provide a working solution, it was just the first step in the process that would some day, perhaps, deliver value back to the business, with little empirical data to back up that assertion.

My (small) team … took the time to understand how to incrementally alter the enterprise architecture to release value early, and we proposed doing this using commodity HTTP servers at £0 cost for middleware. Importantly we backed up our architectural approach with numbers: we measured the throughput and latency characteristics of a representative spike (a piece of code used to answer a question) through our high level design, and showed that both HTTP and our chosen Web server were suitable for the volumes of traffic that the system would have to support … We performance tested the solution every single day to ensure that we would always be able to meet the SLAs imposed on us by the business. We were able to do that because we were not tightly coupled to some overarching middleware, and as a consequence we delivered our first service quickly and had great confidence in its ability to handle large loads. With middleware in the mix, we wouldn’t have been so successful at rapidly validating our service’s performance. Our performance testing would have been hampered by intricate installations, licensing, ops and admin, difficulties in starting from a clean state, to name but a few issues … The last I heard a few weeks back, the system as a whole was dealing with several hundred percent more transactions per second than before we started. But what’s particularly interesting, coming back to the cost of people versus cost of middleware argument, is this: we spent nothing on middleware. Instead we spent around £1 million on people, which compares favourably to the £10 million up front gamble originally proposed.

This strikes me as an example of the kind of approach North advocates.

You may be wondering what other cool .NET things were presented. The session, called State of the Art .NET, was given by Amanda Laucher and Josh Graham. They offered a dozen items which they consider .NET folk should be using or learning about:

  1. F# (again)
  2. M – modelling/DSL language
  3. Boo – static Python for .NET
  4. NUnit – unit testing. Little regard for Microsoft’s test framework in Team System, which is seen as a wasted and inferior effort.
  5. RhinoMocks – mocking library
  6. Moq – another mocking library
  7. NHibernate – object-relational mapping
  8. Windsor – dependency injection, part of Castle project. Controversial; some attendees thought it too complex.
  9. NVelocity – .NET template engine
  10. Guerrilla SOA – see above
  11. Azure – Microsoft’s cloud platform – surprisingly good thanks to David Cutler’s involvement, we were told
  12. MEF – Managed Extensibility Framework as found in Visual Studio 2010, won high praise from those who have tried it

That was my last session (I missed Friday) though I did attend the first part of CloudCamp, an unconference for cloud early adopters. I am not sure there is much point in these now. The cloud is no longer subversive and the next new thing; all the big enterprise vendors are onto it. Look at the CloudCamp sponsor list if you doubt me. There are of course still plenty of issues to talk about, but maybe not like this; I stayed for the first hour but it was dull.

For more on QCon you might also want to read back through my Twitter feed or search the entire #qcon tag for what everyone else thought.

Ubuntu Linux: the agony and the ecstasy

Just after writing a positive review of Ubuntu Karmic Koala I noticed this piece on The Register: Early adopters bloodied by Ubuntu’s Karmic Koala:

Blank and flickering screens, failure to recognize hard drives, defaulting to the old 2.6.28 Linux kernel, and failure to get encryption running are taking their toll, as early adopters turn to the web for answers and log fresh bug reports in Ubuntu forums.

Did I get it wrong? Should I be warning users away from an operating system and upgrade that will only bring them grief?

I doubt it, though I see both sides of this story. I’ve been there: hours spent trying to get Bluetooth working on the Toshiba laptop on which I’m typing; or persuading an Asus Eee PC to connect to my wi-fi; or running dpkg-reconfigure xserver-xorg to try to get Compiz working or to escape basic VGA; or running Super Grub to fix an Ubuntu PC that will not boot; or trying to fix a failed migration from Lilo to Grub 2 on my Ubuntu server.

That said, I noticed that the same laptop which gave me Ubuntu Bluetooth grief a couple of years ago now works fine with a clean install, Bluetooth included. It’s even possible that my own contribution helped – that’s how Linux works – though I doubt it in this case.

I also noticed how Ubuntu 9.10 has moved ahead of Windows in several areas. Here are three:

  1. Cloud storage and synchronization

    Microsoft has Live Mesh. Typical Microsoft: some great ideas, I suspect over-engineered, requires a complex runtime to be downloaded and installed, not clear where it fits into Microsoft’s overall strategy, still in beta long after it was first trumpeted as a big new thing. So is this thing built into Windows 7? No way.

    By contrast Ubuntu turns up with what looks like a dead simple cloud storage and synchronization piece – Ubuntu One – with web access, file system access, optional sharing, and file sync across multiple computers. I’ve not checked how it handles conflicts; but then Mesh was pretty poor at that too, last time I looked. All built into Karmic Koala: click, register, done.

  2. Multiple workspaces

    Apple and Linux have had this for years; I have no idea why it isn’t in Windows 7, or Vista for that matter. Incredibly useful – if the screen is busy but you don’t fancy closing all those windows, just switch to a new desktop.

  3. Application install

    This is so much better on Linux than on Windows or Mac; the only platform I know of that is equally user-friendly is the iPhone. OK, iPhone is better, because it has user ratings and so on; but Ubuntu is pretty good: Software Centre – browse – install.

I could go on. Shift-Alt-UpArrow, Ubuntu’s version of Exposé, is very nice, and not on Windows. And the fact that I can connect a file explorer over SSH using Places – Connect to Server, where on Windows I have to download and install WinSCP or the like.
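Incidentally, the same SSH connection can be opened straight from the terminal if you prefer; Nautilus understands sftp:// locations. The user name, server and path below are placeholders:

nautilus sftp://tim@myserver.example.com/home/tim   # opens a file browser window over SSH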

Plus, let’s not forget that Ubuntu is free.

Of course you can make a case for Windows too. It’s more polished, it’s ubiquitous, and app availability is beyond compare. It is a safe choice. I’m typing this on Ubuntu in BloGTK but missing Windows Live Writer.

Still, Ubuntu is a fantastic deal, especially with Ubuntu One included. I don’t understand the economics by which Canonical can give everyone in the world 2GB of free cloud storage; if it is hoping that enough people will upgrade to the 50GB paid-for version that it will pay for the freeloaders, I fear it will be disappointed.

My point: overall, there is far more right than wrong with Ubuntu in general and Karmic Koala in particular; and I am still happy to recommend it.

Ubuntu Karmic Koala breaks Squeezebox Server

I have an Ubuntu server performing various important duties including serving music for Squeezebox. It was humming along with version 9.04 of Ubuntu and the latest Logitech Squeezebox Server; but a new version of Ubuntu, 9.10 or Karmic Koala, was released today and I hastened to install it.

All went well – aside from a problem with Grub 2 which is related to my slightly unusual setup – except that Squeezebox Server failed after the upgrade completed. When I tried to use aptitude to correct the problem, I saw an error message:

The following packages have unmet dependencies.
  squeezeboxserver: Depends: mysql-server-4.1 but it is not installable or
                             mysql-server-5.0 but it is not going to be installed

Frustrating, particularly as this thread indicates that Squeezebox Server runs fine with MySQL 5.1, which is installed.

I messed around trying to get apt-get to force the install but it would not play. I therefore downloaded the .deb directly and ran the following command:

dpkg -i --force-all squeezeboxserver_7.4.1_all.deb

This tells dpkg to install the package come what may. It did so; and everything works fine.
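If you resort to forcing a package like this, it is worth checking afterwards that dpkg really has registered it and that the service came back; a quick sketch, assuming the init script is named squeezeboxserver as in the official package:

dpkg -l squeezeboxserver                    # status should show as installed (ii)
sudo /etc/init.d/squeezeboxserver restart   # restart the service and watch for errors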

Update: Andy Grundman tells me that the problem is fixed in Squeezebox Server 7.4.2, currently in beta.

Ubuntu 9.04 not so jaunty

I still love Ubuntu, but it’s hard to find much to enthuse about in the latest release, 9.04, also known as Jaunty Jackalope. As this post observes, most of the changes are under the hood, so users will not notice much difference from the previous release, Intrepid Ibex or 8.10. Well, there’s faster start-up, and OpenOffice 3.0 – but then again, I installed OpenOffice 3.0 as soon as Intrepid came out, so this is not really exciting.

My own upgrade went better than the last one, but I’ve still had problems. Specifically:

  • I had to edit Grub’s menu.lst manually after the upgrade. I always have to do this, since it detects the hard drive configuration incorrectly.
  • My Adobe AIR installation was broken and had to be re-installed
  • I’ve lost hardware graphics acceleration and desktop effects. This is a laptop with embedded Intel graphics; apparently this is a common problem and Intel graphics support in Jaunty is work in progress. See here for more details and an experimental suggested fix, which is not for the faint-hearted.

There are other updates, of course, and I was glad to see Mono 2.0.1 and MonoDevelop 2.0 available in the repository, for .NET development on Linux. If Jaunty is the same as before, but faster and more stable, that is no bad thing, though the shaky Intel graphics support undermines that argument.

My question: why is Canonical persevering with its policy of supposedly major releases every six months? This looks to me like a minor update; would it not be better to present it as updates to 8.10, and focus efforts on 9.10 in October? Six-monthly releases must be a heavy burden for the team.

I don’t mean to put you off Ubuntu. It is well worth trying either as a companion or alternative to Windows and Mac.

Update:

I have fixed my desktop effects. How? First, a little more about the problem. DRI (Direct Rendering Infrastructure) was not enabled. My graphics card (from lspci -nn | grep VGA) is:

Intel Corporation Mobile 945GM/GMS, 943/940GML Express Integrated Graphics Controller [8086:27a2] (rev 03)

The problem I had before was reported in Xorg.0.log as:

Xorg.0.log:(EE) intel(0): [dri] DRIScreenInit failed. Disabling DRI.

I also noticed that /dev/dri/card0 did not exist on my system.

Well, I tried the technique described here. That is, I booted into an older version of the kernel; the oldest available on my system being 2.6.22.14. DRI magically started working. Then I rebooted into the latest version of the kernel, 2.6.28.11. DRI still works. So I am sorted. I’d be interested to know why this works.
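For the record, these are the checks I use to confirm that direct rendering really is on (glxinfo comes from the mesa-utils package):

glxinfo | grep "direct rendering"   # should report: direct rendering: Yes
ls -l /dev/dri/                     # card0 should now exist
grep -i dri /var/log/Xorg.0.log     # no more DRIScreenInit failures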

Reasons to love Linux #1: package management

I posted recently about a difficult Ubuntu upgrade, drawing the comment “What do you prefer to do on Linux that you don’t on Windows?”

Today I patched the Debian server which runs this blog. APT upgraded the following applications:

  • MySQL 5
  • Apache 2.2
  • Clam AntiVirus
  • Time Zone data (tzdata)

Some of these involve several packages, so 16 packages were updated.

Bear in mind that this is a running system, and that MySQL and Apache are in constant heavy use, mostly by WordPress.

I logged on to the terminal and typed a single command:

apt-get upgrade

The package manager took less than a minute to upgrade all the packages, which had already been downloaded via a scheduled job. Services were stopped and started as needed. No reboot needed. Job done.
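For anyone who has not seen it, the whole exercise is a couple of commands; the download-only step is the sort of thing the scheduled job does overnight, though the exact cron setup will vary:

apt-get update          # refresh the package lists
apt-get -d -y upgrade   # download the upgrades only, e.g. from a nightly job
apt-get upgrade         # install the already-downloaded packages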

I guess a few people trying to access this site got a slow response, but that was all.

Now, how long would it take to upgrade IIS, SQL Server and some server anti-virus package on Windows? What are the odds of getting away without a restart?

Admittedly this is not risk-free. I’ve known package management to get messed up on Linux, and it can take many hours to resolve – but this usually happens on experimental systems. Web servers that stick to the official stable distribution rarely have problems in my experience.

I realise that the comment really referred to desktop Linux, not server, and here the picture is less rosy. In fact, this post was inspired by a difficult upgrade, though in this case it was the entire distribution being updated. Even on the desktop though, the user experience for installing updates and applications is generally much better.

Let’s say I’m looking for an image editor. I click on Add/Remove and type a search:

I like the way the apps show popularity. I’d like a few more things like ratings and comments; but it’s a start. Inkscape looks interesting, so I check it, click Apply Changes, and shortly after I get this dialog:

I double-click, and there it is:

I admit, I did take a few moments to download an example SVG file from the W3C, just to make the screen grab look better. But provided you have broadband, and the app you want is in the list, it is a great experience.
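For the command-line inclined, the same install is equally painless; a minimal sketch, with the SVG file name as a placeholder:

sudo apt-get update
sudo apt-get install inkscape
inkscape example.svg   # open a drawing, e.g. the W3C sample I downloaded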

Windows Vista has had a go at this. From Control Panel – Programs and Features you can get to Windows Marketplace, where you might search and find something like The Gimp (free) or Sketsa SVG Editor (costs). I tried The Gimp, to compare like with like. I had to sign in with a Live ID even though it is free. I went through several web dialogs and ended up with a download prompt for a zipped setup. That was it.

In other words, I went through all these steps, but I still do not have The Gimp. OK, I know I have to extract the ZIP and run the setup; but Ubuntu’s Add/Remove spares me all that complication; it is way ahead in usability.

App Store on the iPhone also has it right. For the user, that is. I detest the lock-in and the business model; but usability generally wins. The online stores on games consoles, like Xbox Live Marketplace, are good as well. I guess one day we will install or buy most applications this way.

A painful upgrade to Ubuntu Intrepid Ibex

I’m writing a piece on Ubuntu – makes a change from all that Windows at Microsoft’s PDC. I wanted to be up-to-date, so I upgraded my laptop from Hardy Heron (8.04) to Intrepid Ibex (8.10), released just yesterday. I followed the officially recommended procedure. Currently I only have a wi-fi connection, which is not ideal, but I reckoned it might work. Before upgrading, I applied all available updates to the existing 8.04 installation.

The update manager started off confidently enough, though it sat for a long time on ldconfig deferred processing. Then it asked for a restart, and things started going wrong. Ubuntu could only boot to a terminal prompt, since it was missing packages needed for X, the graphical server, to start. I tried to fix this with apt-get; but I had another problem: the wifi connection was down. I managed to get this working with ifconfig and iwconfig, and repaired my system with apt-get update and apt-get dist-upgrade. This downloaded and installed some 340MB of packages, after which I could boot to the desktop.
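For anyone in the same fix, the commands I mean go something like this; the interface and network names are examples, and a WPA-protected network would also need wpa_supplicant:

sudo ifconfig wlan0 up                  # bring the wireless interface up
sudo iwconfig wlan0 essid "MyNetwork"   # associate with the access point
sudo dhclient wlan0                     # obtain an IP address
sudo apt-get update && sudo apt-get dist-upgrade   # then repair the broken upgrade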

I was not done yet. On startup, Ubuntu was pausing when configuring the network. When the desktop appeared, I had the problem usually expressed as nm-applet not appearing in the panel. This actually meant that the network manager had crashed. If I tried to restart it, it said “no connections defined” and hung with some other errors. Once again, I could only restore wi-fi by fiddling with console commands. I discovered I was not alone with the nm-applet problem. The fix that worked for me was to remove all references to network devices other than loopback in /etc/network/interfaces, as described here. Restarted, the network applet returned, and I could finally connect conveniently.
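Concretely, the fix leaves /etc/network/interfaces with nothing but the loopback entries, so that Network Manager takes charge of everything else:

# /etc/network/interfaces – loopback only; Network Manager handles the rest
auto lo
iface lo inet loopback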

I got a surprise when I tried to browse the web. The upgrade had removed most of my applications, including Firefox and OpenOffice. I had to reinstall these using Add/Remove Applications. I did find that Firefox, once reinstalled, had remembered my settings, for which I was grateful.

Now that Intrepid Ibex is up and running, it will probably be as stable, fast and capable as Hardy Heron before it – really, it was. Linux is great, honest.

Eee 901 problems – does Asus still care about Linux?

I am reviewing the Asus Eee PC 901, the one with the Intel Atom processor. Of course I asked to see the Linux version. In my view Linux is better suited than Windows for a device with limited storage; and it is more interesting to me since the original Eee PC 701 was something of a breakthrough for desktop Linux.

No problem with the hardware; but the OS is a bit of a mess. The first problem is that the wireless card does not work properly for me. Asus have used a less common Ralink card – maybe it saved a few pennies over the Atheros it used to have – but out of the box it is not set up right. When I try to connect with WPA encryption I get:

Error for wireless request “Set Frequency” (8B04)
SET failed on device ra0; network is down
ioctl[SIOCSCIWAUTH]: Operation not supported

Looks like an update is needed. Here’s where the big problems start. With the 701 I had no problems updating, whether using the Synaptic GUI, or apt-get in a console. The new Eee currently offers me two updates in its “Updates and New Software” applet, one for “StarOffice Mime Types” which installs fine, and the other for “Webstorage Update”, which fails. Click Details and it is blank; no error message.

Trying apt-get instead is equally frustrating. Thanks to dependencies, updating almost any package results in a huge download – taking over an hour over broadband. Then the update fails because it runs out of disk space. That, and some packages are returning a 404; I also got size mismatch errors. Note: use apt-get clean after one of these exercises as that will free disk space.
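In other words, after one of these failed attempts, something like this recovers the space:

sudo apt-get clean   # delete the cached .deb downloads
df -h                # check how much space has come back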

The fact is, update is broken. One solution is not to update – though security is always a concern – but that still leaves the wireless problem unsolved.

This is careless of Asus. Part of the idea with the Eee is that it is an appliance: it just works and hides all that Linux gunk. Except it is failing to do so, because of errors in the package management. Here’s what one user says:

This is sad. The thing that really helped launch the original 701 into reality is gone, and that’s Linux…I know my way around computers, and I know where to look to fix stuff, but this would leave a horrible taste in anyone’s mouth that wasn’t accustomed to finessing Linux (that’s the nice way of saying it)…I can’t say I see much of a future for Linux on the Eee.

It’s early days for the 901; maybe it will all be fixed soon. Still, at the very least it is being pushed out before the software is ready, which is a shame because there is a lot to like as well.

The best advice for those who don’t mind tweaking may be to install Ubuntu or some other distribution.

Update: I fixed the wi-fi issue eventually – see here.


What’s new in Subversion 1.5

The team behind the open source SCM (Software Configuration Management) tool Subversion released version 1.5 last month. Karl Fogel, president of the Subversion Corporation, says:

Measuring by new features alone, Subversion 1.5 is our biggest release since version 1.0 became available in February 2004.

I am a contented Subversion user, so took a look at the changes. Top of the list is “merge tracking”, though it is described as “foundational”, which means that although the basic support is there, performance and feature work remains to be done. From the user’s perspective, the difference is that branching and merging is just easier than before, as explained by Ben Collins-Sussman:

Notice how I never had to type a single revision number in my example: Subversion 1.5 knows when the branch was created, which changes need to be synced from branch to trunk, and which changes need to be merged back into the trunk when I’m done. It’s all magic now. This is how it should have been in the first place.
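To make that concrete, here is roughly what the 1.5 branch-and-merge cycle looks like; the repository URL is a placeholder:

# create a feature branch
svn copy http://svn.example.com/repo/trunk \
         http://svn.example.com/repo/branches/my-feature -m "Create branch"

# in a working copy of the branch: keep it in sync with trunk –
# no revision numbers needed, merge tracking works them out
svn merge http://svn.example.com/repo/trunk
svn commit -m "Sync branch with trunk"

# in a working copy of trunk: merge the finished branch back
svn merge --reintegrate http://svn.example.com/repo/branches/my-feature
svn commit -m "Reintegrate my-feature"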

Other changes include sparse checkouts (when you only want to grab a small part of a repository), and changelists, a client feature which lets you tag a set of files under a changelist name and work on them as a group. There are also improvements aimed at making Subversion better suited to large-scale deployments using multiple servers. Subversion is still a centralized rather than a distributed SCM system, but 1.5 is better suited for use in a distributed manner. No doubt the Subversion team is aware of the increasing interest in Git, a distributed system. There are also numerous bug-fixes and performance tweaks. The changes are described here.
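A quick sketch of those two client features, again with a placeholder URL:

# sparse checkout: grab only the top level, then deepen one subtree
svn checkout --depth immediates http://svn.example.com/repo/trunk myproject
cd myproject
svn update --set-depth infinity docs

# changelists: group related files and commit them together
svn changelist spelling-fixes README.txt INSTALL.txt
svn commit --changelist spelling-fixes -m "Fix spelling"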

I want to move to Subversion 1.5 but it is not that easy. Compatibility is good, in that older clients work with 1.5 servers and vice versa, the main proviso being that you cannot mix several versions of the Subversion client with the same working copy. That is not likely to be a problem for most users.

The difficulty I encountered is that mainstream Linux distributions still have older versions of Subversion in their stable repositories. Ubuntu, for example, has version 1.4.4. My most-used Subversion repositories are on a Debian server, which also has an old version. I don’t want to switch the server to sid, Debian’s development distribution, and mixing packages is often problematic. I could do a manual installation I guess; but instead I will be patient.

I did install 1.5 on Windows, for an intranet repository. I used the CollabNet download. All I needed to do was to inform the installer of the location of my existing repository, and then copy a few lines from the old Apache 2.0 configuration file to the new Apache 2.2 configuration file. Everything works fine. I also updated TortoiseSVN on the Windows clients.

One of the advantages of Subversion (or any SCM) repositories over synched folders like those in Microsoft’s Live Mesh or Apple’s MobileMe (as I understand it) is that you get version history. I regard this as a key feature. The problem with synchronization is that you might overwrite a good copy with a bad one. It is easy to do; it might be caused by user error, or a bug in your word processor, or a failing hard drive. Automatic synch (un)helpfully replicates the bad copy all over. Versioning means you can just rollback to the good one.
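With Subversion the rollback is a reverse merge; the file name and revision numbers here are examples:

svn log -q report.doc             # find the last good revision, say r123
svn merge -r 130:123 report.doc   # reverse-merge the bad changes out of the working copy
svn commit -m "Roll back report.doc to r123"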

What to say about Ubuntu Hardy Heron?

I installed Ubuntu Hardy Heron, a “long term support” release which went final yesterday.

It’s a tricky thing to assess. There are in general two things to say about Linux. First, you can take the line that it is a wonderful thing: free, fast, responsive and capable. You can do your work on this, even run a business on it. You can write applications in Java, C# or any number of other languages. You can have fun with it too – it’s great for multimedia, just a shame that few games support it. Finally, it is nice to know that most of the world’s malware is targeting someone else’s operating system.

Alternatively, you can argue that Linux is fiddly, perplexing, over-complicated, inconsistent, and still not ready for the general public.

It is tempting to give Ubuntu an easy ride because it is free and because we so much want it to succeed; we need an alternative to the Microsoft tax or the Apple tax. Unfortunately you never have to look far to find little problems or things that should be easy but end up consuming considerable effort.

Here’s one thing I noticed today. Close Firefox. Open the Help Centre, and click a web link. The Help Centre opens Firefox with the link you requested, but then cannot be used until you close the Firefox instance. Trying to close it brings up a “Not responding” message. If Firefox was already running when you clicked the link, it is fine.

Here is another. Open Help Centre, click Playing Music, then Listen to online audio streams. It says I can install Real Player 10 and that it is available from the “commercial repository”. What is the “commercial” repository? This page describes four Ubuntu repositories: main, restricted, universe and multiverse. Real Player is not in any of them. Further, if you try to install it using apt-get, the following message appears:

Package realplayer is not available, but is referred to by another package. This may mean that the package is missing, has been obsoleted, or is only available from another source
E: Package realplayer has no installation candidate
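The first thing I do when I see this sort of message is check which repositories are actually enabled and whether the package exists under another name:

grep -v '^#' /etc/apt/sources.list   # list the enabled repositories
apt-cache search realplay            # see whether anything similar is packaged
apt-cache policy realplayer          # show where, if anywhere, the package could come from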

Hey, it’s Linux. Just Google and you’ll find a way. Who needs Real Player anyway? But that’s not the point … the point is that these little issues crop up and make running Linux less fun for non-geeks.

Here’s another one: I tried GNU Chess. I poked around in Preferences and chose the 3D view. It said:

You are unable to play in 3D mode due to the following problems:
No Python OpenGL support
No Python GTKGLExt support

Please contact your system administrator to resolve these problems, until then you will be able to play chess in 2D mode.

Fair enough; it is a clear, accurate and informative message – aside from the bit about “contacting your system administrator” which sounds like it was borrowed from Windows. You can just about forgive it in business software, but this is a game.

I still love Ubuntu. This one installed easily and updates nicely; the fancy graphics effects work smoothly; and most important, the same machine which felt slow with Vista now seems more like a high-performance workstation.

In other words, it is easy to support either line of argument. Personally I veer towards the favourable view; but I doubt fear of Ubuntu is keeping anyone in Redmond awake at night.