Going Mobile

In the back of my mind I knew that this blog looked terrible on a mobile, but I did nothing about it until @monkchips complained that it was unreadable on his HTC Magic, which runs Google Android 1.6.

I don’t have an Android device, but I grabbed the SDK, ran up the emulator, and had a look. The page took ages to load, and did not work properly even when fully loaded.

I figured “there’s a plugin for that”, and there is – several, in fact. I settled on the WordPress Mobile Pack, installed and configured it, and a short time later it was up and running.

I had a few hassles, mainly because most of my WordPress installation is not writeable by the web server, while this plugin needs to write themes on installation and temporary images after that, so I had to loosen permissions slightly. I then set the themes directory back to read-only, and configured the cache so that Apache will serve only images.

I still only get a score of Fair (2 fails) from the MobiReady report, but it is progress: I am ahead of bbc.co.uk, which gets Bad (10 fails), though behind microsoft.com, which rates Good (0 fails).

The plugin also tells me that 5% of the traffic to this site is from mobile users. More than I had expected.

Beep beep.

Windows Presentation Foundation now ready, too late

The immortal film The Railway Children has a scene in which a band plays during an award presentation. Unfortunately a series of false starts delay the performance, until finally it all comes together and the music begins. The camera pans – the audience has already departed.

Is it like that for WPF (Windows Presentation Foundation), Microsoft’s user interface framework which is built on .NET and DirectX and was intended to replace the ancient GDI (Graphics Device Interface) and GDI+?

In this new post I make the case that with WPF 4.0 the framework is now truly ready to use, not least because Microsoft itself is using it in Visual Studio, and the interaction between the Visual Studio and WPF teams has solved a number of problems in WPF.

But who now wants to develop just for Windows? Well, it makes sense in some contexts, though I note that in the Thoughtworks paper on emerging technology and trends about which I wrote yesterday, neither Windows nor WPF gets a mention. Nor for that matter does the Mac, Linux, or OS X, though iPhone and Android feature strongly. The only emerging desktop technology that interests Thoughtworks is the browser.

Technology trends: Silverlight, Flex of little use says Thoughtworks as it Goes Google

Today Martin Fowler at Thoughtworks tweeted a link to the just-published Thoughtworks Technology Radar [pdf] paper, which aims to “help decision makers understand emerging technologies and trends that affect the market today”.

It is a good read, as you would expect from Thoughtworks, a software development company with a bias towards Agile methodology and a formidable reputation.

The authors divide technology into four segments, from Hold – which means steer clear for the time being – to Adopt, ready for prime time. In between are Assess and Trial.

I was interested to see that Thoughtworks is ready to stop supporting IE6 and that ASP.NET MVC is regarded as ready to use now. So is Apple iPhone as a client platform, with Android not far behind (Trial).

Thoughtworks is also now contemplating Java language end of life (Assess), but remains enthusiastic about the JVM as a platform (Adopt), and about Javascript as a first class language (also Adopt). C# 4.0 wins praise for its new dynamic features and pace of development in general.

Losers? I was struck by how cool Thoughtworks is towards Rich Internet Applications (Adobe Flash and Microsoft Silverlight):

Our position on Rich Internet Applications has changed over the past year. Experience has shown that platforms such as Silverlight, Flex and JavaFX may be useful for rich visualizations of data but provide few benefits over simpler web applications.

The team has even less interest in Microsoft’s Internet Explorer – even IE8 is a concern with regard to web standards – whereas Firefox lies at the heart of the Adopt bullet.

In the tools area, Thoughtworks is moving away from Subversion and towards distributed version control systems (Git, Mercurial).

Finally, Thoughtworks is Going Google:

At the start of October, ThoughtWorks became a customer of Google Apps. Although we have heard a wide range of opinions about the user experience offered by Google Mail, Calendar and Documents, the general consensus is that our largely consultant workforce is happy with the move. The next step that we as a company are looking to embrace is Google as a corporate platform beyond the standard Google Apps; in particular we are evaluating the use of Google App Engine for a number of internal systems initiatives.

A thought-provoking paper which makes more sense to me than the innumerable Gartner Magic Quadrants; I’d encourage you to read the whole paper (only 8 pages) and not to be content with my highlights.

Seven years of blogging, and a redesign

This blog began in 2003, though the website goes back to 2000, and, especially with the decline of RSS readers, I now see little difference between what is now a blog and what in 2000 was a more painful process of authoring web content. My first blogging efforts were powered by a now-defunct project called bblog, which I modified heavily to add features and to cope with comment spam – almost non-existent in 2003. Then in 2006 I accepted that I would be better off with a mainstream blog engine and selected WordPress, which has exceeded my expectations.

When I moved to WordPress I picked a theme which met my requirements, then modified it to tidy up the layout and to support non-intrusive advertising. I found myself to some extent boxed in once again, since I could not change or upgrade the theme without losing my modifications. This also meant I was missing out on newer features of WordPress. Widget support is a breakthrough feature, letting you add features to the site through a simple drag-and-drop admin page, but I could not use widgets with my customised theme. I also wanted to support gravatars, which show an image chosen by the author alongside their comments, and to add a ratings system.

Ratings are a lot of fun, though not really reliable as a gauge of quality. If your article extolling the merits of the Xbox 360 gets linked by a PlayStation fan site, or your article critical of Apple gets linked by an Apple fan site, there is little chance of a fair rating. Some readers also find it difficult to separate what they think about the subject matter from what they think about the quality of reporting. Even so, ratings are always interesting and I’d like to include a list of best-rated posts.

It has taken me some time to find a theme that looked right for my needs, but I have now settled on Atahualpa from BytesForAll. It is a popular theme, so my blog will look similar to many others, but it is flexible and I’ve been able to add the most important features by modifying settings rather than editing the raw PHP, a critical issue for upgradability. I’ve also added rating support with GD Star Rating.

As ever, it is a work in progress, and I expect to modify the design and add features as time allows. Although it may not look much improved yet, it is now much easier to modify in a maintainable fashion, so expect more changes soon.

What’s wrong with Microsoft Hotmail?

Joe Wilcox has a good post about Microsoft’s decade of shattered dreams. These are all things in which the company invested, but did not get right: eBooks, HailStorm web services, digital music, Origami small computing devices.

The list is longer than that, of course. Tablet PC is a big one; we’ll see what happens to Apple’s efforts. And when I researched a retrospective on .NET recently I was struck by how well Microsoft got the mash-up idea: building block services pulled together by .NET web sites or client applications. And let’s not forget that Microsoft demonstrated Ajax partial page refresh in September 2000. These ideas have not been total failures at Microsoft, but their potential has been realised mainly by others.

That brings me to Hotmail (also known as Live Mail), the web-based email service that launched in 1996 and was acquired by Microsoft in late 1997. Microsoft was long teased for keeping it running on Unix while promoting Windows Server; that is emphatically no longer the case, as explained in two recent blogs by Arthur de Haan and Dick Craddock. It was moved off Unix in 2004, and rewritten in C# and ASP.NET in 2005. According to de Haan, it is the largest SQL Server 2008 deployment in the world. Impressive.

It would be absurd to call Hotmail a failure, when it has 1.3 billion inboxes and 350 million active users. Nevertheless, when I read or hear people recommending a web-based email service, it is almost always Google Gmail, not Hotmail (nor Yahoo for that matter). There are several people with whom I communicate professionally at Gmail addresses, none that I can think of on Hotmail.

Last year, Information Week reported that in the USA Gmail was set to overtake Hotmail in 2009; I do not know if it did so, but it would not surprise me, though internationally Hotmail is likely still ahead. Yahoo was well ahead of both, but will not be immune to the Google effect.

How has Google managed to steal mindshare away from Microsoft’s long-established service?

One reason is that Google got it right almost from the first public beta, whereas Hotmail has made pretty much every mistake in the book, though it has gradually corrected most of them. For a long time my Hotmail account was nearly unusable because of spam, while Gmail has great spam filters. Hotmail had inadequate storage, until Gmail turned up with 1GB of storage and its competitors quickly followed suit.

Another factor is the user experience. When I go to Gmail, I get a full page dedicated to email, and it is responsive and generally pleasant to use. The Hotmail UI is busier, the ads are more intrusive, and it takes longer to load.

Still, Hotmail is usable and much better than it once was. What else is wrong?

There is a clue in comments to de Haan’s blog. Hotmail has traditionally been awkward if you want to use offline mail clients, which is odd considering Microsoft’s “software plus services” approach. The Outlook Live Connector has always been troublesome. POP3 support eventually arrived, but users want IMAP as offered by Google.

Another problem is that Hotmail has never seemed core to Microsoft’s strategy. We all know how Microsoft does email, and it is not Hotmail, it is Exchange. Hotmail is a consumer service. Both marketing and product integration efforts are mainly focused on Exchange.

Despite its 350 million users, I reckon Hotmail needs a Bing-style makeover.

Joining the Smartphone dots

Google has made a big splash with its launch of the Nexus One, even though technically it is not all that exciting. A neat phone: 1GHz Qualcomm processor; runs Android 2.1; good for web video with its inclusion of Adobe Flash 10.1, along with the ability to capture your own videos at 20 frames per second in 720×480 pixels. No keyboard though; and the Q&A at the press briefing revealed a few limitations, such as lack of tethering support (using the phone to connect a laptop to the Internet), and that downloaded applications all end up in the 512MB of on-board memory rather than on an SD card, making it more likely that you will run out of space. Tethering is being worked on, apparently, and the application restriction is for copy protection, supposedly making it more difficult to pirate paid-for downloads.

My biggest disappointment is the price. It is a fraction cheaper than an Apple iPhone, but still far from a mass market product; though it won’t feel that way in the tech influencer community.

All this is rather unimportant; even prices will fall eventually. What matters is that attention is shifting from web+desktop (or laptop) to web+smartphone as the computing platform of the moment. That shift is far from complete; most of us still need the large screen and comfortable keyboard of a laptop to do our work. It is real though, and it is obvious that the need to carry around a bulky laptop with a short battery life is diminishing. Netbooks and Apple’s rumoured tablet are part of the same movement towards smaller, lighter and web-connected.

Although these gadgets are getting more capable, there is no sign of them following the desktop model with feature-rich local applications and heavy use of local storage. The applications being downloaded in huge numbers from Apple’s app store – a breathtaking three billion to date according to today’s announcement – are small, single-purpose apps where speed and usability is valued over richness of features, and where data comes from the Internet. This is the new model of application development.

Google’s announcement is also an important move in the identity wars. Most computer users have multiple identities: maybe an Active Directory account on a Microsoft network, a Facebook account, an Apple ID for iTunes and MobileMe, a Google account for Gmail and Google Docs. All these competing players gain hugely if they can increase the importance of your identity on their platform versus the others. If Microsoft can keep your Active Directory account at the centre of your world, then you will be a customer for Exchange, Office, SharePoint and so on. On the other hand, if your Google sign-in becomes more important, then Google’s products are correspondingly more attractive and it can sell you more services and advertising. Buy a Google phone and you hook directly into Google’s world. In ChromeOS the link is even more obvious, since you sign onto the computer with your online Google credentials.

The power shift is obvious. And as Tim O’Reilly implies in his excellent post, Google’s lack of legacy desktop baggage is helping it to compete against Apple as well as Microsoft.

How Intel’s compiler underperforms on other CPUs: artificial impairment versus failure to optimise

Last month the US Federal Trade Commission sued Intel for anti-competitive practices; and in my post on the subject I tried to make sense of part of the FTC’s complaint, that:

Intel secretly redesigned key software, known as a compiler, in a way that deliberately stunted the performance of competitors’ CPU chips.

A few days ago Agner Fog wrote an article that sheds some light on the subject:

The reason is that the compiler or library can make multiple versions of a piece of code, each optimized for a certain processor and instruction set, for example SSE2, SSE3, etc. The system includes a function that detects which type of CPU it is running on and chooses the optimal code path for that CPU. This is called a CPU dispatcher. However, the Intel CPU dispatcher does not only check which instruction set is supported by the CPU, it also checks the vendor ID string. If the vendor string says “GenuineIntel” then it uses the optimal code path. If the CPU is not from Intel then, in most cases, it will run the slowest possible version of the code, even if the CPU is fully compatible with a better version.
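Intel has not published the generated dispatcher code, but a minimal hand-written sketch in C illustrates the mechanism Fog describes: read the vendor string and feature flags with the CPUID instruction, then pick a code path. The routine names (sum_sse2, sum_generic) are hypothetical; the contentious part is the vendor-string gate on the fast path.

```c
#include <cpuid.h>    /* GCC/Clang wrapper for the x86 CPUID instruction */
#include <stdio.h>
#include <string.h>

/* Hypothetical optimised and fallback implementations of the same routine */
static void sum_sse2(void)    { puts("SSE2 code path"); }
static void sum_generic(void) { puts("generic code path"); }

/* Returns a pointer to the version of the routine chosen at run time */
static void (*dispatch_sum(void))(void)
{
    unsigned eax, ebx, ecx, edx;
    char vendor[13] = {0};

    /* Leaf 0: the vendor string arrives in EBX, EDX, ECX
       ("GenuineIntel", "AuthenticAMD", ...) */
    __get_cpuid(0, &eax, &ebx, &ecx, &edx);
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);

    /* Leaf 1: feature flags; SSE2 is bit 26 of EDX */
    __get_cpuid(1, &eax, &ebx, &ecx, &edx);
    int has_sse2 = (edx >> 26) & 1;

    /* The behaviour Fog describes: the fast path is taken only when the
       vendor string is "GenuineIntel", even if another vendor's CPU
       reports the same instruction-set support */
    if (strcmp(vendor, "GenuineIntel") == 0 && has_sse2)
        return sum_sse2;

    return sum_generic;   /* everyone else gets the slowest version */
}

int main(void)
{
    dispatch_sum()();   /* pick a code path, then run it */
    return 0;
}
```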

Fog also notes a clause in Intel’s November 2009 settlement with AMD [PDF] in which the company undertakes not to:

include any Artificial Performance Impairment in any Intel product, or require any Third Party to include an Artificial Performance Impairment … “Artificial Performance Impairment” means an affirmative engineering or design action by Intel (but not a failure to act) that (i) degrades the performance or operation of a Specified AMD product (ii) is not a consequence of an Intel Product Benefit and (iii) is made intentionally to degrade the performance or operation of a Specified AMD product.

It is a fine distinction. At what point does failure to optimise constitute “affirmative engineering”? What riles developers is that even when a non-Intel CPU reports support for an optimisation such as SSE, the Intel-compiled code will not use it unless the CPU is Intel’s. You could argue that this is an inaction (failure to optimise) rather than an action (deliberately running slower); but the end result is the same: worse performance on non-Intel processors.
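By contrast, a vendor-neutral dispatcher would trust whatever feature flags the processor reports and ignore the vendor string entirely. Again, this is an illustrative sketch rather than anything Intel ships:

```c
#include <cpuid.h>

/* Vendor-neutral feature test: trust the CPUID feature flags,
   whatever the vendor string says (illustrative only) */
static int cpu_has_sse2(void)
{
    unsigned eax, ebx, ecx, edx;
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
        return 0;
    return (edx >> 26) & 1;   /* SSE2 is bit 26 of EDX for leaf 1 */
}
```

Dispatching on checks like this is what it would mean, in practice, to take advantage of obvious optimisations on third-party processors.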

The obvious practical solution is to use other compilers, but for certain types of work Intel’s compiler is considered the best.

Has Intel now agreed to do a better job? You tell me; I don’t think the clause quoted above tells us one way or another. I do think it is legitimate for the government to press Intel at least to take advantage of obvious optimisations on third-party processors, since this benefits everyone. Even so, Intel will always optimise best for its own CPUs and that is to be expected.

Performance tests are another issue. It is deceptive to produce test results showing performance differences without also revealing that in one case the code is optimised, and in another it is not. That said, if Intel has a smart optimisation that is specific to its own CPUs, there is no reason why it should not trumpet the fact. This is a matter of disclosure.

Finally, developers take note: if you are compiling for a general market that might or might not be using Intel CPUs, maybe the Intel compiler is not the best choice.
