A year of blogging: another crazy year in tech

At this time of year I allow myself a little introspection. Why do I write this blog? In part because I enjoy it; in part because it lets me write what I want to write, rather than what someone will commission; in part because I need to be visible on the Internet as an individual, not just as an author writing for various publications; in part because I highly value the feedback I get here.

Running a blog has its frustrations. Adding content here has to take a back seat to paying work at times. I also realise that the site is desperately in need of redesign; I’ve played around with some tweaks in an offline version but I’m cautious about making changes because the current format just about works and I don’t want to make it worse. I am a writer and developer, but not a designer.

One company actually offered to redesign the blog for me, but I held back for fear that a sense of obligation would prevent me from writing objectively. That said, I have considered doing something like Adobe’s Serge Jespers and offering a prize for a redesign; if you would like to supply such a prize, in return for a little publicity, let me know. One of my goals is to make use of WordPress widgets to add more interactivity and a degree of future-proofing. I hope 2010 will be the year of a new-look ITWriting.com.

So what are you reading? Looking at the stats for the year confirms something I was already aware of: the most-read posts are not news stories but how-to articles that solve common problems. Those readers are not subscribers, but individuals searching for a solution to their problem. For the record, here are the top five, in order:

Annoying Word 2007 problem – can’t select text – when Office breaks

Cannot open the Outlook window – what sort of error message is that? – when Office breaks again

Visual Studio 6 on Vista – VB 6 just won’t die

Why Outlook 2007 is slow – Microsoft’s official answer – when Office frustrates

Outlook 2007 is slow, RSS broken – when Office still frustrates

The most popular news posts on ITWriting.com:

London Stock Exchange migrating from .NET to Oracle/UNIX platform – case study becomes PR disaster

Parallel Programming: five reasons for caution. Reflections from Intel’s Parallel Studio briefing – a contrarian view

Apple Snow Leopard and Exchange – the real story – hyped new feature disappoints

Software development trends in emerging markets – are they what you expect?

QCon London 2009 – the best developer conference in the UK

and a few others that I’d like to highlight:

The end of Sun’s bold open source experiment – Sun is taken over by Oracle, though the deal has been subject to long delays thanks to EU scrutiny

Is Silverlight the problem with ITV Player? Microsoft, you have a problem – prophetic insofar as ITV later switched to Adobe Flash; it’s not as good as BBC iPlayer but it is better than before

Google Chrome OS – astonishing – a real first reaction written during the press briefing; my views have not changed much, though many commentators don’t get its significance for some reason

Farewell to Personal Computer World – 30 years of personal computing – worth reading the comments if you have any affection for this gone-but-not-forgotten publication

Is high-resolution audio (like SACD) audibly better than CD? – still a question that fascinates me

When the unthinkable happens: Microsoft/Danger loses customer data – as a company Microsoft is not entirely dysfunctional but for some parts there is no better word

Adobe’s chameleon Flash shows its enterprise colours – some interesting comments on this Flash for the Enterprise story

Silverlight 4 ticks all the boxes, questions remain – in 2010 we should get some idea of Silverlight’s significance, now that Microsoft has fixed the most pressing technical issues

and finally HAPPY NEW YEAR

Browser wars: IE loses 12% market share in 2009, Germany hates it

I’ve been looking at the browser stats for 2009. According to StatCounter, Microsoft began the year with a 67.19% share for IE (versions 6-8 combined) and ends it with 55.23%. That’s a loss of almost 12 percentage points, or a relative decline of around 18%, depending how you figure it.

The biggest part of that share has gone to Firefox, which started with 25.08% and closes with 31.92% – a gain of nearly 7 percentage points, or a rise of around 27%.
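To spell out the arithmetic behind those two ways of putting it, here is a quick Python sketch using the StatCounter figures quoted above:

# StatCounter shares quoted above, in percent
ie_start, ie_end = 67.19, 55.23
ff_start, ff_end = 25.08, 31.92

# Absolute change: percentage points of the whole market
ie_point_loss = ie_start - ie_end          # ~11.96 points ("a 12% loss")
ff_point_gain = ff_end - ff_start          # ~6.84 points ("a gain of 7%")

# Relative change: proportion of each browser's own starting share
ie_relative_decline = 100 * ie_point_loss / ie_start   # ~17.8% ("an 18% decline")
ff_relative_rise = 100 * ff_point_gain / ff_start      # ~27.3% ("a rise of 27%")

print(ie_point_loss, ie_relative_decline)
print(ff_point_gain, ff_relative_rise)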

The big story is that Firefox 3.5 is now the world’s most popular single browser version. Although true on these figures, it is partly because IE 7 on the way down is crossing IE 8 on the way up; it’s possible that IE 8 will overtake Firefox sometime next year, though by no means certain.

However, there are huge regional variations. The UK loves IE: currently IE 8 is on 31.48% vs Firefox 3.5 on 19.2%, and IE overall is 56.02%. Germany, on the other hand, hates it.

According to these stats, Firefox 3.5 has a 44.19% share in Germany and IE 8 just 15.32%. The USA is somewhere in between, though closer to the UK in that IE 8 is in the lead with 26.64%.

Overall, clearly a good year for Mozilla and a bad one for Microsoft.

What about the future? Well, it’s notable that not all IE migrants are going to Firefox. The “Other” category shows a steady increase, and I’d bet that a large chunk of it is based on WebKit, either in mobile browsers or in Google Chrome. Apple’s Safari is also WebKit-based, and has increased its share significantly during 2009. Mozilla should worry that developers are largely choosing WebKit rather than Gecko.

A bigger concern for Mozilla is the big G, source of most of its income. Google pays Mozilla for search traffic sent its way. It cannot be good when your main customer has a product that competes directly with your own. I’m guessing that a Google browser will overtake Firefox during the next decade.

More patent nonsense: Microsoft loses in Office custom XML appeal

Microsoft has lost its appeal in a case where a small company called i4i claims that Office 2003 and 2007 infringe its patent on embedding custom XML within a Word document. This is not the XML that defines the content and layout of the document; it is XML contained within the document that Word itself does not understand, because it conforms to a custom schema, and which will not be displayed unless you write code to parse it and output some sort of result to the document.

Microsoft now says:

With respect to Microsoft Word 2007 and Microsoft Office 2007, we have been preparing for this possibility since the District Court issued its injunction in August 2009 and have put the wheels in motion to remove this little-used feature from these products. Therefore, we expect to have copies of Microsoft Word 2007 and Office 2007, with this feature removed, available for U.S. sale and distribution by the injunction date.  In addition, the beta versions of Microsoft Word 2010 and Microsoft Office 2010, which are available now for downloading, do not contain the technology covered by the injunction.

The key phrase here is “little-used feature”. It is true, in that the vast majority of Word documents do not use it; the only users affected will be those who have built custom solutions which use it in some kind of workflow or for data analysis.

Why did Microsoft lose? Here I have to admit my lack of legal knowledge, though I’m aware that Microsoft’s track record in court is not good. One interesting aspect of the case reported here is that Microsoft was proven, by an email dated January 22, 2003, to have been aware of the patent and of i4i’s products:

we saw [i4i’s products] some time ago and met its creators. Word 11 will make it obsolete

says the internal email; Word 11 is another name for Word 2003.

That said, intuitively both the patent and the decision seem odd to me, in that XML is specifically designed to allow data with a custom schema to be embedded within a document defined by another schema. But does the i4i patent cover every XML document out there that does this – such as, for example, XHTML documents that include microformats? The answer, as I understand it, is no, because the patent is about how the custom XML is stored, not that it exists. Here’s a quote from the patent itself:

The present invention is based on the practice of separating encoding conventions from the content of a document. The invention does not use embedded metacoding to differentiate the content of the document, but rather the metacodes of the document are separated from the content and held in distinct storage in a structure called a metacode map, whereas document content is held in a mapped content area … delivering a complete document would entail delivering both the content and a metacode map which describes it.

In other words, the custom XML is not stored directly within the containing document, but in a separate file, together with an instruction that says “please insert me at location x”.

Is that really any different? Intuitively, I doubt it. What we think of as single files are often in reality a number of sections bundled together, such as a header part and a content part. Further, what we think of as a single file may be stored in several locations, with metadata that defines how to get from one part to the next.

An Office 2007 document such as .docx is in reality a ZIP archive containing several separate files, organised according to the Open Packaging Conventions; if the i4i patent has wider implications, it strikes me that they would be for OPC rather than for XML itself.
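You can see that packaging for yourself with a few lines of Python; a minimal sketch, assuming a document saved as example.docx (the file name is mine, for illustration):

import zipfile

# A .docx file is a ZIP archive of parts, per the Open Packaging Conventions
with zipfile.ZipFile("example.docx") as docx:
    for name in docx.namelist():
        print(name)

# Typical output includes word/document.xml (the main content) plus, if the
# document uses the feature at issue here, separate parts under customXml/
# such as customXml/item1.xml and customXml/itemProps1.xml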

I don’t claim any expertise in whether or not i4i has a valid claim against Microsoft or others. I do have an opinion though, which is that this kind of patent litigation does not benefit either the industry or the general public. This particular case concerns me, because the patent strikes me as generic, and one that could be applied elsewhere, which means more effort expended to work around legal issues rather than on improving the software we use; and because even if the feature in Word is “little-used”, the concept is an important one that still has great potential – though now probably not in Microsoft Office.


Splashtop: the pragmatic alternative to ChromeOS

Today I received news of a new Eee PC range from Asus which will be based on the Intel Atom N450. Two things caught my eye. One was the promise of “up to 14 hours of battery life”. The other was the inclusion of dual-boot. The new range offers both Windows 7 and what Asus calls Express Gate, a lightweight Linux which boots, it is claimed, in 8 seconds.

Express Gate is a version of Splashtop: a web-oriented OS that offers a web browser based on Firefox, a music player, and instant messaging. There is also support for:

View and edit Microsoft Office compatible documents as well as the latest Adobe PDF formats

though whether that means OpenOffice or something else I’m not yet sure. The Adobe Flash runtime and Java are included, and you can develop custom applications. Citrix Receiver and VMware View offer the potential of using Splashtop as a remote desktop client.

The idea is that you do most of your work in Windows, but use Splashtop when you need access right now to some document or web site. I can see the value of this. Have you ever got halfway to a meeting, and wanted to look at your email to review the agenda or location? I have. That said, a smartphone with email and web access meets much of this need; but I can still imagine times when a larger screen along with access to your laptop’s hard drive could come in handy.

The concept behind Splashtop has some parallels with Google’s ChromeOS, which also aims to “get you onto the web in a few seconds”. The Asus package includes up to 500GB of free web storage, and of course you could use Google’s email and applications from Splashtop. Another similarity is that Splashtop claims to be:

a locked-down environment that is both tamper proof and malware/virus resistant.

That said, ChromeOS is revolution, Splashtop is evolution. The Google OS will be a pure web client, according to current information, and will not run Windows or even Linux desktop applications. Knowing Google, it will likely be well executed and easy to use, and more polished than versions of Splashtop hurriedly customised by OEM vendors.

Splashtop on the other hand arrives almost by stealth. Users are getting a Windows netbook or laptop, and can ignore Splashtop if they wish. Still, that fast boot will make it attractive for those occasions when Splashtop has all you need; and frankly, it sounds as if it successfully captures 80% of what many users do most of the time. Splashtop could foster a web-oriented approach for its users, supplemented with a few local applications and local storage; and some may find that it is the need for Windows that becomes a rarity.

It is telling that after years of hearing Microsoft promise faster boot times for Windows – and in fairness, Windows 7 is somewhat quicker than Vista – vendors are turning to Linux to provide something close to instant-on.

Moonlight 2 released; no Microsoft codecs unless you get it from Novell

The Mono Project has released Moonlight 2, its implementation of Silverlight for Linux. I tried my own database application and was pleased to find that it works fine; better than it did with the earlier release.

Moonlight’s right-click menu offers some handy debugging features as well as an invitation to “Install Microsoft Media Pack”. If you choose this, you get a dialog offering the Microsoft codecs, which are downloaded from Microsoft, not from Mono servers. You have to agree to a EULA that restricts use to Moonlight running in a web browser.

That last bit is intriguing; it seems Microsoft is trying to prevent desktop or out-of-browser Moonlight (or Mono) from taking advantage of its codecs.

So what is in Moonlight 2? Miguel de Icaza explains:

Moonlight 2 is a superset of Silverlight 2. It contains everything that is part of Silverlight 2 but already ships with various features from Silverlight 3.

Those additional features include the pluggable pipeline, easing animation support, writeable bitmaps, and partial out-of-browser support. Further, de Icaza says:

We are moving quickly to complete our 3 support. Microsoft is not only providing us with test suites for Moonlight but also assisting us in making sure that flagship Silverlight applications work with Moonlight.

There is also a new patent covenant that:

ensures that other third party distributions can distribute Moonlight without their users fearing of getting sued over patent infringement by Microsoft

That said, the media pack is a source of friction. Only the Novell Moonlight distribution will raise the above dialog to install the Microsoft codecs; others will have to make their own arrangements; at least that is how I understand de Icaza’s post.

It seems an odd restriction, and means that most users should download from Novell.

What is the future of Microsoft Small Business Server?

I’ve just attended a briefing on Microsoft’s server products, and the future of the Small Business variant was one of the things we discussed.

There are a couple of issues with Small Business Server that make me question its future. One is that, at a time when cloud-based services are proving their ability to simplify computing for small businesses, Microsoft’s offering is more or less cloud-free.

A second issue is that by bundling onto one machine products that were designed to live on separate servers, Microsoft has made Small Business Server more complex to manage than a grown-up Windows server environment, especially when upgrading to a new version.

I’d like to see SBS migrated to a virtual environment, with Exchange, SharePoint and Active Directory each running in its own virtual machine. This is more or less how I run my own test system, and it works very well. It is more flexible, less fragile, requires no special tuning, and is easier to look after than single-server SBS.

That of course presumes that you think there is still a need for SBS at all. The other scenario I’d like to see enabled is one where the on-premises server is in effect a cache for cloud-based services, so that if a disaster occurred there would be no interruption to the business.

But what does Microsoft have in mind? It is not saying, though I was assured that it is an area of continuing investment – in other words, there will be another Small Business Server – and that sales remain healthy (then again, vendors always say that).

One of the complications for Microsoft is that SBS is generally installed and maintained by partners (of varying levels of competence) and it will take courage to disrupt that business. More than likely we will just get SBS 2010 with Exchange 2010, Windows Server 2008 R2 and so on. In other words, more of the same.

US Federal Trade Commission sues Intel

Just as the EU declared victory over Microsoft, having secured the dubious benefit of a browser “choice screen” – I wonder whether users will suspect malware when this thing appears – the FTC has stepped in with an antitrust case of its own.

the FTC alleges that Intel has waged a systematic campaign to shut out rivals’ competing microchips by cutting off their access to the marketplace. In the process, Intel deprived consumers of choice and innovation in the microchips that comprise the computers’ central processing unit, or CPU.

The FTC’s main complaint is that Intel allegedly used:

threats and rewards aimed at the world’s largest computer manufacturers, including Dell, Hewlett-Packard, and IBM, to coerce them not to buy rival computer CPU chips. Intel also used this practice, known as exclusive or restrictive dealing, to prevent computer makers from marketing any machines with non-Intel computer chips.

There is also a complaint about GPUs, though it’s not yet clear to me whether this is because the FTC considers that bundling a GPU with the motherboard or CPU sale is anti-competitive in itself, or whether some other issue is involved.

The FTC also makes what to me is an intriguing complaint about Intel’s compiler:

In addition, allegedly, Intel secretly redesigned key software, known as a compiler, in a way that deliberately stunted the performance of competitors’ CPU chips. Intel told its customers and the public that software performed better on Intel CPUs than on competitors’ CPUs, but the company deceived them by failing to disclose that these differences were due largely or entirely to Intel’s compiler design.

I am struggling a bit with this one. Is the FTC saying that Intel’s compiler has code in it that says in effect:

if (!IntelCompiler) { GoSlowly(); }

Or is the FTC merely saying that the Intel compiler optimises for Intel hardware? That is not a secret; I’ve been told as much by Intel on many occasions.

Clearly if customers were deceived by this in demonstrations the FTC has a point; but the kind of customer who is interested in CPU performance tests is likely to be familiar with the idea that compilers can optimise for specific hardware. It’s a good thing too; if I have Intel hardware I want a compiler that will optimise for features specific to that hardware.
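To make the distinction concrete, here is a rough Python sketch of the kind of runtime dispatch in question – my own illustration of the concept, not a claim about what Intel’s compiler actually emits:

def cpu_vendor():
    # In real compiled code this would come from the CPUID instruction;
    # a hard-coded placeholder is used here purely for illustration.
    return "GenuineIntel"   # or "AuthenticAMD", and so on

def sum_squares_optimised(values):
    # Stand-in for a code path using vendor-specific optimisations (SSE and so on)
    return sum(v * v for v in values)

def sum_squares_generic(values):
    # Stand-in for a plain fallback path
    total = 0
    for v in values:
        total += v * v
    return total

def sum_squares(values):
    # The legal question is whether the fallback is merely left unoptimised,
    # or made deliberately slower than it needs to be on non-Intel CPUs.
    if cpu_vendor() == "GenuineIntel":
        return sum_squares_optimised(values)
    return sum_squares_generic(values)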

Intel has issued the following statement:

“Intel has competed fairly and lawfully. Its actions have benefited consumers. The highly competitive microprocessor industry, of which Intel is a key part, has kept innovation robust and prices declining at a faster rate than any other industry. The FTC’s case is misguided. It is based largely on claims that the FTC added at the last minute and has not investigated. In addition, it is explicitly not based on existing law but is instead intended to make new rules for regulating business conduct. These new rules would harm consumers by reducing innovation and raising prices.”

Intel senior vice president and general counsel Doug Melamed added, “This case could have, and should have, been settled. Settlement talks had progressed very far but stalled when the FTC insisted on unprecedented remedies – including the restrictions on lawful price competition and enforcement of intellectual property rights set forth in the complaint — that would make it impossible for Intel to conduct business.”

“The FTC’s rush to file this case will cost taxpayers tens of millions of dollars to litigate issues that the FTC has not fully investigated. It is the normal practice of antitrust enforcement agencies to investigate the facts before filing suit. The Commission did not do that in this case,” said Melamed.


Adobe financials and the future of packaged software

I listened to Adobe’s investor conference call yesterday following the release of its fourth quarter results, to the end of November 2009.

The results themselves were mixed at best: revenue was down in all segments year on year and there was a $32 million GAAP net loss, but Adobe reported an “up-tick” towards the end of the quarter and says that it expects a strong 2010, presuming a successful launch for Creative Suite 5.

Adobe’s situation is interesting, in that while it is doing well in strengthening the Flash Platform for media and to a lesser extent for applications, that success is not reflected in its results.

The reason is that it depends largely on sales of design software (mainly Creative Suite) for its revenue. According to its datasheet [PDF], this was how its revenue broke down for the financial years 2006 to 2009:

                        2006   2007   2008   2009
Creative                 56%    60%    58%    58%
Business Productivity    32%    29%    30%    29%
Omniture (analytics)       –      –      –     1%
Platform                  4%     4%     6%     6%
Print and Publishing      8%     6%     6%     6%

“Creative” is Creative Suite and its individual products, plus things like Audition and Scene 7.

“Business productivity” encompasses Acrobat (including Acrobat.com), LiveCycle servers, and Connect Pro web conferencing.

“Platform” is developer tools and Flash Platform Services, though not LiveCycle Data Services.

“Print and Publishing” is PostScript, Director, Captivate, and old stuff like PageMaker and FrameMaker but not InDesign.

Some of this segmentation seems illogical to me and probably to Adobe as well; there are no doubt historical reasons.

If the economy recovers and Creative Suite 5 delivers a strong upgrade, Adobe may well have the good 2010 that it is hoping for. One of the things mentioned by CEO Shantanu Narayen was that an aging installed base of PCs, more than five years old, was holding back its sales; no doubt most of those PCs are running Windows XP. It made me wonder how much the general disappointment with Vista has affected companies such as Adobe, which benefit when PCs are upgraded, and how much the good reception for Windows 7 may now help them.

Still, there is one aspect of the above figures that rings alarm bells for me. They show no evidence that Adobe is able to migrate its business from one dependent on packaged software sales to one that is service-based. That is important, because I suspect that the packaged software model is in permanent decline.

The pattern which I’ve seen now for many years as a software reviewer is that a vendor brings out version x of its product and explains why it is a must-have upgrade from version x-1, which (it turns out) has a number of deficiencies that are only now being addressed.

A year or two later, there’s another upgrade, another briefing, and lo! it is version x+1 that you really need; version x was not that good after all.

It is a difficult act for vendors to sustain, and hated by users too. Even when users have signed up for some sort of service contract that gets them new releases for free, many are reluctant to upgrade because of the pain factor; if the old edition is performing well, they see no need to switch.

The next-generation software world replaces this model with Internet applications where upgrade is seamless and at no extra cost. You pay for the service, either with money (Salesforce.com) or mainly with advertising (Google Apps).

Adobe is there, of course, with Acrobat.com for productivity applications, and also tools for building them with Flash, Flex and AIR. But it is one thing to be there, and another thing for those investments to be delivering an increasing proportion of overall revenue; and the table above suggests that progress is slow.

It will be fascinating to see how this unfolds over the coming decade.

Reflections on Microsoft PDC 2009

Microsoft’s Professional Developers Conference has long been a key event in the company’s calendar. CEO Steve Ballmer and his colleagues are famous for their belief that developers make or break a platform, and PDC is where the most committed of those developers learn as much as Microsoft is willing to share of its long-term plans. There have been good PDCs – the C# and .NET launch in 2000, Windows 7 in 2008 – and bad ones – Hailstorm in 2001, Longhorn in 2003 – but they have all been interesting, at least the ones I have attended.

So how was PDC 2009? While there was a ton of good content, and an impressive launch for Silverlight 4, there was a noticeable lack of direction; maybe that was why Ballmer decided not to show up. It should have been the Windows Azure PDC, but as I have just written elsewhere, Microsoft shows little excitement about its own cloud. Chief Software Architect Ray Ozzie gave almost exactly the same keynote this year that he gave last year; and the body language, as it were, is more about avoiding the cloud than embracing it. Cross-platform clients, commodity pricing, throw away your servers: from Microsoft’s point of view, what’s not to hate?

In theory, mobile computing could have been another big story at the PDC, but Microsoft’s slow progress in Mobile is well known.

My instinct is that Microsoft needs to change but does not know how: the wheels continue to turn and we will get new versions of Windows, ever more complex iterations of Windows Server, Exchange and SharePoint, and feature after feature added to Microsoft Office – does it really need to become Photoshop? – but in the end this is more of the same.

The mitigating factors are the high quality of Windows 7, which will drive a lot of new PC sales for a quarter or two, and the strong products coming out of the developer division. Visual Studio 2010 plus Silverlight is an interesting platform, and ASP.NET MVC is in my opinion a big advance from Web Forms.

That’s not enough though; and we still await a convincing strategic discussion of how Microsoft intends to flourish in the next decade.


When backups fail

Jeff Atwood has lost the content from two popular blogs that he runs:

http://blog.stackoverflow.com
http://www.codinghorror.com

thanks to:

100% data loss at our hosting provider, CrystalTech.

He gives a little more detail here. He is now trying to recover data from search engine caches such as Google’s – a painful business, apparently; Google banned his IP.

Backup is a complex problem. I’d been meaning to post on the subject following another recent incident. Here’s a quote from an email a friend received from his ISP after asking whether the SQL Server database was backed up:

Needless to say, we do back the databases up every 12 hours to a remote location automatically.

Just 11 days later “a crucial disk” failed on that SQL Server, following which the ISP discovered that its recent backups were also “corrupt” and data was lost. In the end a data recovery specialist was enlisted and most, but not all, of the data was recovered.

No doubt the post-mortem will reveal multiple issues; but it shows that knowing backups are being done is insufficient. You have to do test restores as well, because the backup might not be working as well as you think.
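As a sketch of what a test restore might look like for a SQL Server backup – hypothetical server, database and file names, using the pyodbc module, and strictly for a scratch server rather than production:

import pyodbc

# Hypothetical connection details; adjust for your own environment
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=scratch-server;Trusted_Connection=yes",
    autocommit=True,   # RESTORE cannot run inside a user transaction
)
cursor = conn.cursor()

# Quick check: is the backup file readable and structurally intact?
cursor.execute("RESTORE VERIFYONLY FROM DISK = N'D:\\backups\\mydb.bak'")

# The stronger test is a full restore to a scratch database followed by a
# sanity query against the restored data, along these lines (logical file
# and table names are hypothetical):
# cursor.execute("RESTORE DATABASE [mydb_test] FROM DISK = N'D:\\backups\\mydb.bak' "
#                "WITH MOVE 'mydb' TO 'D:\\scratch\\mydb_test.mdf', "
#                "MOVE 'mydb_log' TO 'D:\\scratch\\mydb_test.ldf', REPLACE")
# cursor.execute("SELECT COUNT(*) FROM mydb_test.dbo.orders")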

In addition, as Atwood is now tweeting:

don’t trust the hosting provider, make your OWN offsite backups, too!

Good advice for those of us using commodity ISPs. But it also gives me pause for thought following the CloudForce event I attended earlier this week. A specialist like Salesforce.com has more resources to put into data resilience than any of its users. So if Salesforce.com (or Amazon, or Google, or Microsoft) is your provider, is it then OK to leave backup to them?
