How not to ship a hard drive

I ordered a hard drive from a supplier and was taken aback by the way it was shipped to me – in a flimsy padded envelope with no additional protection.


In case you are wondering how to ship a hard drive, this is the illustration from the Western Digital support site:


which adds:

Place in sturdy cardboard box. Do not use chipboard, as it is not strong enough to withstand the rigors of transit. Please make sure the corrugated carton is free from defects and is structurally sound. Note: Returning a WD hard drive in an envelope, will void the warranty.

I protested, and the supplier offered to take the drive back but gave me no explanation for the incorrect packaging. Surprisingly the drive checks out OK, although hidden damage remains a concern.

Quick thoughts on Xcode and Objective C versus Microsoft’s tools

I have been trying out JetBrains’ AppCode, which meant working in an Apple development environment for a time. I took the opportunity to implement my simple calculator app in native iOS code.


Objective C is a distinctive language with a mixed reputation, but I enjoy coding with it. I used Automatic Reference Counting (ARC), a feature introduced with Xcode 4.2 for OS X 10.7 and iOS 5; ARC now also works on OS X 10.6 and iOS 4. This means objects are automatically disposed of, and I did not have to worry about memory management at all in my simple app. This is not a complete memory management solution (if there is such a thing) – if you use malloc you must use free – but it meant that the code in my app is not particularly verbose or complex compared to other languages. Apple’s libraries favour plain English method names like stringByAppendingString, which makes for readable code.
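The malloc caveat can be shown with a plain C sketch (my own illustration; the function is hypothetical, not code from my app). ARC releases Objective C objects for you, but C heap allocations remain entirely manual:

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical helper: ARC does not manage C heap memory,
   so the buffer returned here must be released with free()
   by the caller - ARC will not do it. */
char *joined_copy(const char *a, const char *b) {
    char *out = malloc(strlen(a) + strlen(b) + 1);
    if (out == NULL) return NULL;
    strcpy(out, a);   /* copy the first string */
    strcat(out, b);   /* append the second, with terminator */
    return out;
}
```

Under ARC an NSString built with stringByAppendingString needs no matching release, while a buffer like the one above still does; that is the boundary between what ARC manages and what it does not.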

I was impressed by how easy it is to make an app that looks good, because the controls are beautifully designed. I understand the attraction of developing solely for Apple’s platform.

I also love the integrated source control in Xcode. You find yourself using a local Git repository almost without thinking about it. Microsoft could learn from that; no need for Team Foundation Server for a solo developer.

I did miss namespaces. In Objective C, if you want to remove the risk of name collision with a library, you have to use your own class prefix (and hope that nobody else picked the same one).
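For illustration, the workaround looks like this (the class name and the “TA” prefix are hypothetical, my own choices rather than anything from a real library):

```objc
// "TA" is a two-letter prefix picked to reduce collision risk;
// another library author using the same prefix would still clash.
@interface TACalculatorEngine : NSObject
- (double)evaluate:(NSString *)expression;
@end
```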


Interface Builder, the visual UI designer, is great, but many developers do not use it because coding the UI by hand is more flexible. It is a shame that you have to make this choice, unlike IDEs with “two-way tools” that let you edit in code or visually and seamlessly keep the two in sync. I found myself constantly having to re-display windows like the Attributes Inspector, though it is not too bad once you learn the keyboard shortcuts. The latest Interface Builder has a storyboard feature which lets you define several screens and link them. It looks useful, though when I played with it I found it difficult to follow all the linking lines the designer drew for me.

It is interesting to compare the Mac and iOS development platform with that for Windows. Microsoft promotes the idea of language choice, though most professional development is either C# or C++, whereas on Apple’s platform it is Objective C and Cocoa or you are on your own. Although Mac and Windows are of a similar age, Microsoft’s platform gives a GUI developer more choices: Win32, MFC, WTL, Windows Forms, Windows Presentation Foundation and Silverlight, and in Windows 8 the new WinRT.

I get the impression that Microsoft is envious of this single-minded approach and trying to bring it to Metro-style Windows 8, where you still have a choice of languages but really only one GUI framework.

That said, Visual Studio is an impressive tool and both C# and C++ have important features which are lacking in Objective C. I would judge that Visual Studio is the more productive tool overall, but Apple’s developer platform has its own attractions.

Moving Windows with its applications: too difficult

I have just replaced my PC – well, if you count new motherboard, new CPU, new hard drive, new RAM as replacement, though it sits in the same case – and faced again the question of what to do with my Windows setup, complete with hundreds of applications.

A few years back, there was no question. You took every opportunity to do a clean install, because without it Windows gradually became unusable, as gloriously recounted by Verity Stob.

Stob’s analysis is not completely wrong today, but the matter has greatly improved. The Windows 7 64-bit installation that I use today was installed in August 2009 (run systeminfo if you want to check yours), and that was an in-place upgrade from Windows Vista 64-bit, as recorded here. That Vista install was done in January 2008, so I have preserved applications and settings for coming up to four years and two motherboard changes.

The trade-off is that in return for putting up with some cruft you get a big win in convenience. There is no need to dig out install media, downloads and licence codes, and migration to a new system is quicker.

So why complain? Well, although it can usually be done, moving Windows from one machine to another is not supported by Microsoft, unless the hardware is identical:

Microsoft does not support restoring a system state backup from one computer to a second computer of a different make, model, or hardware configuration. Microsoft will only provide commercially reasonable efforts to support this process. Even if the source and destination computers seem to be identical makes and models, there may be driver, hardware, or firmware differences between the source and destination computers.

What this means is that users who get a new computer are directed instead towards the Windows Easy Transfer application:


This is a handy tool, but it does not transfer applications. This last point can be particularly tiresome if you use software that requires activation on each machine on which it is installed, not least Microsoft’s own Windows and Office. Adobe’s Creative Suite, for example, allows installation on up to two machines, after which it will no longer install unless you specifically deactivate it:


If you trash your old PC, or it breaks, without deactivating first, then you have to call support and plead your case.

Apple’s Migration Assistant, by contrast, does move applications, making a better user experience.

If you can easily move applications, settings and data, of course, there is no need to move the entire operating system, since you have all that matters.

Why does Microsoft make this so hard? Two reasons I can think of.

One is that there are technical challenges in moving Windows to new hardware; though having said that, I suspect that Microsoft could easily have created a migration wizard that includes applications if it wished to do so.

The second, and more important, is licensing. Most consumer versions of Windows (and Office too) are OEM licences, which may not be transferred from the machine with which they were supplied. If Microsoft made it easier to move Windows or to migrate applications, less new software would be sold. Enterprises are expected to handle this in a different way, with centralised application management tools.

Virtualisation changes the game of course. The point of virtualisation is that you run the operating system on abstracted hardware that can easily be replicated on another machine. I really would like to run a virtual desktop, but I do not have a suitably high-powered server and there are niggles over fast graphics, USB devices, studio quality audio and so on. I expect all these to be solved and that a virtual desktop is in my future.

In the meantime, I have personally lost patience with the idea of reinstalling everything, and fortunately I do not use OEM Windows licences.

The wider question is interesting though. Although the desire of Microsoft and its partners to protect licence income is understandable, there are newer models of application licensing that work better for users. In Google’s world you just sign on in your browser, and all your stuff is there. In Apple’s world, your iOS apps are licensed to you, not your device, and when you get a new device they all reappear. Microsoft’s Xbox works like this too, though that was not always the case.

This competition, in combination with virtualisation, means that Microsoft’s approach with Windows looks out of date as well as being unpleasant for users.

Windows 8 is on the horizon, and I would guess that the forthcoming Windows Store will be better in this respect, though note that at its Build conference in September Microsoft did not discuss the business aspects of the Store.

Something has changed for Windows Phone

When Windows Phone 7 launched last year, it was obvious that it could not succeed, since it was all but invisible to most people. In my local small town centre, which has several mobile phone shops, it was nowhere to be seen.

I went out to post a letter just now and was astonished to see this poster in the window of Phones4u:


I went in and discovered that only dummies of the Radar and Titan were on display. I asked to see a Titan and they got one out for me.

The Nokia Lumia 800 was also on display, this one a working model.


The Titan has a gorgeous large screen, and while it is on the bulky side, it is slim and does not feel heavy in the hand. I put it alongside the Lumia; the Titan screen does look larger and better. Unfortunately I could not see the Lumia out of its clip. The Lumia does benefit from Nokia Drive (not working in the shop because there was no internet connection) and seems to be around £100 cheaper than the Titan. The Lumia also has the free British Airways app pre-installed.

I asked the assistant what she thought of Windows Phone and she said she had not tried it. I said I had an HTC Desire (true) and she seemed slightly puzzled about why I would want a Windows Phone though she thought it would be good for work because of Office.

Still, Microsoft’s device has visibility at last, though this seems to be more because of moves by Nokia and HTC than from Microsoft itself. If it can win the support and enthusiasm of some of those influential retail assistants we may see significant growth in market share.

Review: JetBrains AppCode for Objective C

I have been trying out JetBrains AppCode, a new IDE for Apple’s Objective C. The company is best known for its IntelliJ IDE for Java, and AppCode essentially takes the same core IDE and reworks it for Objective C. AppCode is itself a Java application, but unless you have a religious objection to this I doubt you will find it a problem: I found it perfectly snappy and responsive on my machine, a 2.3GHz Core i5 with 8GB RAM.

Installation was a snap, as Mac users expect.


One thing I discovered immediately is that AppCode is not a replacement for Xcode, the official Apple IDE. The Apple SDKs are delivered with Xcode, and AppCode requires it. An AppCode project is also an Xcode project.


This is particularly important if you want to use Interface Builder, the Xcode visual designer, since AppCode has no equivalent. Double-click the .xib file and it opens in Xcode. This is disorientating at first, but in practice I found it convenient to be able to switch between the two IDEs.

So why bother with AppCode, when Xcode is free? It is certainly not essential, but my view is that tools which save time or improve quality are worth the investment. Whether AppCode will do this for you will depend on how you work and whether you have any frustrations with Xcode, which improved considerably in version 4. Out of the box, Xcode has integrated Git or Subversion source code control, unit test integration, refactoring including Rename, Extract, and Encapsulate, the aforementioned Interface Builder, and a ton of other features. Sticking with Xcode is a safe choice.

That said, AppCode feels leaner and less cluttered than Xcode. It also has many additional productivity features in the editor. JetBrains’ IDEs are well known for refactoring, and while AppCode is not as rich as IntelliJ IDEA in this respect, it does offer more than Xcode.

Another strong feature is code generation. Press Command + n in the editor, and a context-sensitive Generate menu offers various time-saving options. I like the way I can type a new method in an implementation file, press Alt + Enter, and select Declare method in the interface to add it automatically to the interface file; or type it first in the interface and have it implemented automatically. It pays to learn the keyboard shortcuts.
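As a rough sketch of that round trip (the class and method names here are hypothetical), you type the method body in the implementation file, and AppCode adds the declaration to the interface file for you:

```objc
// MyCalculator.m - type the new method here first...
- (double)addNumber:(double)a toNumber:(double)b {
    return a + b;
}

// MyCalculator.h - ...then Alt + Enter, "Declare method in the interface"
// inserts the matching declaration:
- (double)addNumber:(double)a toNumber:(double)b;
```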

Live templates let you type an abbreviation and expand it to a block of code, which you then tab through to edit. Type for, select the template, press tab, and AppCode will create a for loop; press tab again to edit the variable name and the number of iterations. You can customise and create your own Live templates in the AppCode Preferences dialog.


There are also a ton of performance tools in AppCode [update: note these are links to Xcode tools]. Choose Profile from the Run menu and select what you want to analyse:


then run your app


You can also do static analysis according to customisable rules.

There is a debugger which works as you would expect including stack trace and variable inspection.

The best thing I can say about AppCode is that it is a pleasure to use. It does not throw up unnecessary dialogs, it works logically, and the tools are easy to use and configure. I have not always found this to be the case with Xcode, and if you spend a significant amount of your time on Objective C development then I recommend grabbing the trial download to discover if it will speed your work.

Post sponsored by Monster for the best in IT Jobs.

Breaking Intel RAID: what happens to your data?

I am upgrading my desktop PC, wondering as I do if this is the last time. I did this four years ago; maybe four years from now cloud, mobile and virtualisation will make this unnecessary.

But I digress. My PC is creaking and I am replacing most of its innards. I use an Intel motherboard with its embedded RAID controller to mirror the data on the main 1TB drive. Since I am now getting new, faster drives, I want to break the RAID. The question though: will this delete all the data?

The puzzle here is that the Intel Matrix Storage Manager insists that when you delete a RAID volume, all its data is lost. But why should you lose data if you are breaking a mirrored (RAID 1) volume? In this configuration, each of two drives maintains identical data. In fact, Intel’s User’s Manual says:

All data on the RAID drives will be lost unless the volume that is selected is a RAID 1 volume.

The utility itself is less comforting though, and when I go to break the RAID it says data will be lost unless it is a “Recovery volume”, which is something slightly different.


Now, if you are like me you set up the RAID in the first place because you would rather not lose that data. On the other hand, my suspicion was that the data would in fact be preserved. Caution prevailed, and I made a Windows system image backup of the entire thing, which took most of a day.

Then I deleted the volume. No data was lost and Windows booted perfectly, though it did reconfigure its storage driver and ask for a restart.

The second drive also retained all the data. Windows took it offline because, according to Disk Management:

The disk is offline because it has a signature collision with another disk that is online

which is fair enough.

I can confirm, then, that in my experience you can delete an Intel RAID mirror without losing the data. Still, if this is data you care about, I guess you are going to take a backup anyway before pressing Y.

The power of Google: how the Panda update hit Experts Exchange

Searching Google recently it struck me that I rarely see results from Experts Exchange. I used to see a lot of these, because I typically search on things like error messages or programming issues for which the site is a useful source.

The site is controversial, because it (kind of) charges for access to its knowledge base but does not pay its experts. I posted about this back in 2009. That said, the quality of its advice is often good, and most answers are available without payment if you scroll far enough down the page. You can also get free access as an expert if you answer a few queries successfully.

Experts Exchange has to some extent been replaced by the StackOverflow group of websites, which are nicer to use and free, but I have found that the chances of getting your obscure query answered can be higher on Experts Exchange, particularly for admin rather than programming queries (of course for admin I am comparing with ServerFault).

Still, I wanted to test my perception that I no longer see Experts Exchange results in Google. I had a look at the Alexa stats for the site.


Wow! That vertical line is around April 2011, which is when Google rolled out its "High Quality Sites Algorithm". The site still ranks in the top 3000 in the world according to Alexa – 2787 at the time of writing – but according to the chart it lost around 50% of its visitors then, and has since declined further.

As noted above, the site is controversial, but I personally never minded seeing Experts Exchange results in my searches since the advice there is often good.

The bit that disturbs me though is simply the power Google has over what we read on the Internet. I appreciate the reasons, but it is not healthy for one corporation to have this level of influence, especially bearing in mind the black box nature of its workings.

Hard drive shortage, price madness

Now and again in the computer industry there is a shortage of components; everyone panic-buys the remaining stock and prices shoot up.

This is happening now with hard drives. Here is what Seagate told its partners:

As has been widely reported, the severe flooding in Thailand is a tragic situation for families and businesses across the region. Currently, all Seagate facilities in Thailand are operational and our production is not constrained by either internal component supply or by our ability to assemble finished products. Rather, we are constrained by the availability of specific externally sourced components. As a result, industry demand will significantly outstrip supply at least for the December quarter and the supply disruption will continue for multiple quarters.

How long the disruption will last is hard to guess, but bearing in mind that manufacturers will be racing to restore production I doubt it will be really long-lived.

In the meantime though, buyer beware. Drives that you could once find for £50 or so in the UK are suddenly three times the price.


The best advice is to postpone that upgrade you were planning. If you cannot wait, it is still worth shopping around.