OOXML vs ODF: where next for interoperability?

Gary Edwards of the Open Document Foundation has a fascinating post on the importance of Microsoft Office compatibility to the success of the ISO-approved OpenDocument format.

It is in places a rare voice of sanity:

People continue to insist that if only Microsoft would implement ODF natively in MSOffice, we could all hop on down the yellow brick road, hand in hand, singing kumbaya to beat the band. Sadly, life doesn’t work that way. Wish it did.
Sure, Microsoft could implement ODF – but only with the addition of application specific extensions to the current ODF specification … Sun has already made it clear at the OASIS ODF TC that they are not going to compromise (or degrade) the new and innovative features and implementation model of OpenOffice just to be compatible with the existing 550 million MSOffice desktops.


The simple truth is that ODF was not designed to be compatible – interoperable with existing Microsoft documents, applications and processes. Nor was it designed for grand convergence. And as we found out in our five years participation at the OASIS ODF TC, there is an across the boards resistance to extending ODF to be compatible with Microsoft documents, applications and processes.

Summary: in Edwards’ opinion, there are technical and political reasons why seamless ODF interop cannot be baked into Microsoft Office. Therefore the Foundation is now working on interop with the W3C’s Compound Document Format, about which I know little.

Surprisingly, Edwards also says that ODF will fail in the market:

If we can’t convert existing MS documents, applications and processes to ODF, then the market has no other choice but to transition to MS-OOXML.

Edwards is thoroughly spooked by the success of SharePoint in conjunction with Exchange, and overstates his case:

If we can’t neutralize and re purpose MSOffice, the future will belong to MS-OOXML and the MS Stack. Note the MS Stack noticeably replaces W3C Open Web technologies with Microsoft’s own embraced “enhancements”. Starting with MS-OOXML/Smart Tags as a replacement for HTML-XHTML-RDF Metadata. HTML and the Open Web are the targets here. ODF is being used as a diversion from the real end game – the taking of the Internet.

I find this implausible. At the same time, I agree about the importance of interoperability with Microsoft Office.

I would also like clarification on the limitations of OOXML/ODF conversion. Here’s a technique that does a reasonable job: open the OOXML document in Microsoft Office and save it to the binary Office format; then open the binary file in OpenOffice and save as ODF. The same works in reverse. Not perfect perhaps, but a whole lot better than the Microsoft-sponsored add-in that works through XSLT. Could this existing OpenOffice code be made into a Microsoft Office plug-in, and if so, what proportion of existing documents would not be satisfactorily converted?
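That two-step round trip is easy to script. Here is a minimal sketch, with two caveats: it drives OpenOffice for both steps (Microsoft Office has no comparable command line), and it assumes an soffice binary that supports a --convert-to switch, which may not match the version you have installed:

```python
import subprocess
from pathlib import Path

def conversion_steps(docx_path: str):
    """Return the two command lines for the round trip:
    OOXML (.docx) -> binary Office (.doc) -> ODF (.odt)."""
    src = Path(docx_path)
    binary = src.with_suffix(".doc")
    return [
        # Step 1: down-convert OOXML to the binary Office format
        ["soffice", "--headless", "--convert-to", "doc", str(src)],
        # Step 2: convert the binary file to OpenDocument
        ["soffice", "--headless", "--convert-to", "odt", str(binary)],
    ]

def convert_to_odf(docx_path: str) -> None:
    for cmd in conversion_steps(docx_path):
        subprocess.run(cmd, check=True)
```

Running this over a large corpus of real-world documents and inspecting the results would be one practical way to answer the question of what proportion converts satisfactorily.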

Note that Sun’s ODF converter seems to be exactly this, except that it does not yet work with Office 2007. It could presumably be used with Office 2003 and the OOXML add-in, to provide a way to convert OOXML to ODF in a single application. Some further notes on Sun’s converter here.

Considering Microsoft’s “rift with the web”

I enjoy the SmoothSpan blog but I’m not convinced by this article on Microsoft’s rift with the web.

Bob Warfield says:

Ever since their spat with Sun over Java, Microsoft has been on an increasingly proprietary path called .NET.

I am not sure why .NET is “increasingly” proprietary. Why is it more proprietary now than it used to be? Arguably it is less so; Mono is more advanced; and in addition Microsoft is going cross-platform with the CLR, by bundling it into Silverlight. That does not make it less proprietary in itself, but means that it is less closely tied to Windows.

Warfield does not quite say, but strongly implies, that .NET is failing in the market:

It’s symptomatic that you can find about 18 million Google hits on “SQL Server” but there are 77 million hits on mySQL.  There are 2+ billion hits for PHP and 135 million for Java.  C# gets a modest 15 million hits.

Right, so by the same logic PHP is vastly more important than Java. For some reason, I get different results on MySQL, which reports 171 million hits. Just for fun I tapped in Oracle, which gets only 105 million, inflated by all sorts of non-database references, so we must conclude that MySQL is far more important in the Enterprise than Oracle.
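To make the point concrete, here is the arithmetic on the hit counts quoted above (figures in millions, taken straight from the post and from my own queries; they are illustrative only):

```python
# Hit counts in millions, as quoted above; "MySQL (mine)" is my own query result.
hits = {
    "SQL Server": 18,
    "MySQL (Warfield)": 77,
    "MySQL (mine)": 171,
    "PHP": 2000,
    "Java": 135,
    "C#": 15,
    "Oracle": 105,
}

# By hit-count logic, PHP would be roughly 15x "more important" than Java...
php_vs_java = hits["PHP"] / hits["Java"]

# ...while the very same MySQL query returns counts differing by a factor of 2+.
mysql_spread = hits["MySQL (mine)"] / hits["MySQL (Warfield)"]

print(f"PHP vs Java: {php_vs_java:.1f}x")
print(f"MySQL spread between two queries: {mysql_spread:.1f}x")
```

When the same query can return counts that differ by a factor of two, ratios built on those counts tell you very little.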

No, this sort of Google-diving is lazy analysis. Sure, the results are interesting, but they are skewed in all sorts of ways.

I am not suggesting that .NET is bigger than Java. Nevertheless, it has been a success story for Microsoft, particularly on the server which is the focus of Warfield’s comments. So too has SQL Server; in fact if I remember rightly, the server side of Microsoft has been showing healthy growth versus the more stagnant Windows/Office side of the business.

Look at what Netcraft is saying: its October 2007 web server survey shows IIS, and implicitly .NET technology, gaining market share, as it has for several months. Don’t take the Apache drop too seriously; Netcraft’s figures are skewed by the decision to remove Google’s servers from the Apache count. Nevertheless, Microsoft seems to be growing its web business on the server side.

Jobs? I track these from time to time in the UK, and C# has shown remarkable growth since its introduction, partly at the expense of VB, but also versus Java. Yes, Java is bigger, but you would expect that.

Why has C# succeeded despite Java? Ease of use, productivity and tools. All of these can be debated; but there is some consensus about the excessive complexity of JEE, which has benefited Microsoft. I’ve also noticed innovations in C# being quietly adopted in Java. Given its false start with Java in the early days, I think Microsoft has done well to establish its new language.

Now, I do partially agree with Warfield. Microsoft is an island and I notice strong polarization when I attend conferences and the like: there is a Microsoft crowd and a non-Microsoft crowd. And I agree that the open source community builds largely on open source technology, within which Java is more widely accepted than .NET. However, the .NET island is relatively large and so far has proved resilient.

Should Microsoft drop .NET and embrace Java or PHP, as Warfield kind-of implies? No. There is no technical need for it, because .NET works well. Nor is it really a rift with the web: it is server technology and actually plays pretty well with others, through web services for example. The key thing on the web is to be cross-platform on the client. Writely, acquired by Google, was a .NET product. Did anyone care? No; in fact I doubt many were even aware of it. Now Google has incorporated it into Docs and I should think it has been rewritten in Python or something. Few care about that either; but if it did not work properly on a Mac or in Firefox we would all hear about it.

I don’t mean to minimize Microsoft’s problems. More than any other company I can think of, Microsoft has difficulty in balancing the needs of its OS and desktop application business with the migration we are all making to the Web. Further, it has big PR and image problems, and poor market acceptance for Vista must be a headache. Yes, there is a Microsoft crisis brewing. I’d suggest though that the company can succeed best by building on .NET, not by abandoning it.


A tale of two Adobe conferences

I am just back from Adobe’s MAX Europe. The previous Macromedia/Adobe conference I attended was Macromedia DevCon in 2002. Remarkably, the gold sponsors at the earlier conference included Microsoft, there to promote .NET technology to Dreamweaver designers. Such a sponsorship seems impossible now. Back in 2002, the big product announcement was Contribute, and its competition was FrontPage. Today, it’s war. Adobe is talking “platform”: hosted services, web applications, desktop applications, and none of it dependent on Windows; while Microsoft has suddenly got the cross-platform habit with its own Flash-like browser plug-in called Silverlight. On Adobe’s side, an amazing, ubiquitous, graphically-rich runtime that just works. On Microsoft’s side, huge resources and armies of .NET developers.

MAX Europe was a good conference. There’s a buzz around the products, and I didn’t meet any disappointed delegates, although there was a little bit of concern that strong designer content was getting squeezed out by the new focus on developers. The Adobe speakers seemed very approachable, and I appreciated the willingness of senior executives to talk to the press. In fact, the company has retained something of a small company feel, at least among the ex-Macromedia team which seemed to dominate at MAX. Adobe also has a clearer focus than Microsoft, which comes over as more bureaucratic and internally conflicted.

Nevertheless, it is possible that some at Adobe are under-estimating Silverlight. One speaker assured us that it only runs in one browser (false). Flex Builder is slow and awkward in comparison to Visual Studio. Adobe does have a big advantage in mobile devices – Nokia was at MAX and is putting Flash in all its high-end phones – but I am not yet convinced of the merits of Flash Mobile.

Mac count at MAX: about 50-50 with Windows on a very rough estimate. That’s proportionally fewer Macs than at FOWA earlier this month, which was maybe 80% Apple.

Now I understand what a rich internet application is

For a while now I’ve been puzzling over what exactly is meant by the term “Rich Internet Application” or RIA. Microsoft wants the initials to stand for “Rich Interactive Application” but it is losing that battle – see this great post by Dare Obasanjo. It is Adobe’s term, but it has never been clear to me exactly what it means. I’ve seen it refer to everything from internet-connected desktop applications, to Flash applications running in the browser, or even plain old HTML and JavaScript.

The way to understand a term is to look at its origin, and here I got a big clue from Adobe’s Chief Software Architect Kevin Lynch. At a press briefing during Adobe MAX Europe last week, Lynch described what happened:

The whole move of Adobe to rich internet applications was actually driven by the community. It was people using the Flash player about 2001, 2002, to start creating not just interactive media or animation experiences, but application experiences. The first one at that time was something called the Broadmoor Hotel reservation system. It was a 5 or 6 page HTML process to check out and they were having a lot of drop off. They turned that into a one-screen check out process in Flash, and they saw their reservations increase by 50%. We actually named that trend. We thought OK, we can do more to support that, and we called it Rich Internet Applications. Then we focused on enabling more of those to be made with these technologies, so a new virtual machine in Flash player, the Flex framework, Flex Builder, all of that was driven by some of those early developers who were pushing the boundaries.

So there you have it. The Broadmoor hotel case study, which I recall seeing demonstrated at Macromedia DevCon in 2002, was apparently a significant influence on the evolution of the Flash player. The first press release about it was in November 2001. The case study is still online, and the application is still around today.

I don’t think we will get closer than this to a definition. Adobe will continue to use it to mean Flash applications; Microsoft will continue to try and de-brand it – the same way it tried to use “blogcast” in place of “podcast”, according to this article. I tend to agree that the concept is bigger than Adobe; but language is organic and cannot be so easily manipulated.

Flash, Silverlight: the future of video games?

According to the BBC, gaming giant Electronic Arts is fed up with having to code the same game three, four or five times over. That’s the downside of the console wars – several incompatible systems.

The article says that streamed server-based games will be increasingly important.

A few observations. First, the PC is the nearest thing to an open platform right now, and it’s interesting that PC games typically cost around 30% less than those on the top consoles. For example, the hot new FIFA 08 typically sells for £40.00 on PS3 or Xbox 360, £25.00 on PC. It’s cheaper on DS or PSP, but must be considerably cut down on these low-powered devices. The Wii is somewhere in between.
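For what it’s worth, the FIFA 08 example actually works out to a slightly bigger saving than 30%:

```python
console_price = 40.00  # typical PS3 / Xbox 360 price in GBP
pc_price = 25.00       # typical PC price in GBP

# Relative saving of the PC version against the console version
discount = (console_price - pc_price) / console_price
print(f"PC saving vs console: {discount:.1%}")  # 37.5%
```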

Second, I’m writing this after seeing the amazing things being done with Flash. Microsoft’s Silverlight is also interesting in this context, as is Canvas 3D – OpenGL running in the browser.

That’s still three separate platforms; but since they are all cross-platform, there would be no necessity to code for more than one of them.

Third, Flash games are already very popular. If you calculate market share by time spent playing, I guess Flash games would already show a significant portion (I’d be interested to see those figures).

Fourth, the success of the Nintendo Wii proves that although geeks care deeply about who can shift pixels and calculate transforms the most quickly, the general public does not. All they want is a playable and enjoyable game.

All this suggests that the business model behind Microsoft’s and Sony’s console strategy is flawed. The idea is to buy market share by subsidizing the hardware, then profit from the software sales to your locked-in users. What if users can get the same games by subscribing, say, to a hypothetical EA Live, and play the games on a variety of devices? The money is still in the software, but there is no hardware lock-in. Prices could fall, and game developers could spend more time being creative and less time re-implementing the same game for different platforms.

Flash is actually in the PS3 and PSP, but appears to be an old version. If Microsoft isn’t thinking about Silverlight for the Xbox 360, then it should be. But if my logic is correct, then the investment Microsoft and Sony have put into game studios is actually more valuable, long-term, than the money they have put into hardware.

That said, the online experience is not yet good enough to threaten the consoles. I doubt it will be long though. A key point is hardware acceleration in the Flash player. H.264 video will be hardware-accelerated in the forthcoming Moviestar release of Flash 9. I am confident that a hardware accelerated gaming API will not be far behind.

The Who: another take on how to sell music online

The rock stalwarts in The Who have come up with their own scheme for selling music in the Internet era.

Fans are invited to join a subscription scheme from November 5th. For a fee of $50.00 per annum, you get an exclusive live CD, access to an online forum, streaming video of concerts “from every Who generation,” and access to the band’s entire back catalog online:

Every Song on Every Album (b-sides too!) … As a Wholigan, you’ll be able to listen online to Who tracks, then add them to your mp3 player, if you like. (This feature will be available in 2008).

We are not told key details like in what sort of quality these media files will be delivered.

Is this a winner? If you consider that Radiohead is asking more than $50.00 for its (currently) internet-only CD and LP package, the Who’s deal is not bad, especially if the downloads are of good quality. It strikes me that some fans will join just for one year, to get the CD and to download songs they do not already own. It is a better deal than David Bowie’s similar arrangement with Bowienet – free double CD and site access for $64.99, but no videos or back catalog access.

Even so, this kind of arrangement is only going to work for a small niche of diehard fans. It is implausible that music lovers would stump up $50.00 or more per year for every artist they enjoy.

I’m glad though that artists are experimenting with different ideas for distributing their music, and not letting Apple call all the shots.

Gutsy Ubuntu and Precipice Computing

The good news: I’ve successfully upgraded two machines from Ubuntu 7.04 (Feisty Fawn) to the new 7.10 (Gutsy Gibbon). I followed the instructions here. The bad news: neither upgrade was without incident.

I’ll start with the server. I use this for SlimServer and for experimenting with interesting Linux-based software; it has no GUI installed. Towards the end of the upgrade I got this message:

“Could not install the upgrades. The upgrade aborts now. Your system could be in an unusable state.”

Not good. I call this “precipice computing”. In the UK a few months back there was some fuss about “precipice bonds”. These are a type of savings bond that guarantees at least your money back, unless certain conditions are met, usually relating to stock market performance. The conditions do not look likely to occur, but if they do, all bets are off and you could lose heavily.

Computing is like this sometimes. You tinker with your system and safe, user-friendly options guide you every step of the way. Except that under certain circumstances they do not, and then you may be deep in the mire.

It turned out to be not so bad. Ubuntu automatically ran dpkg, a package management tool. It reported some dependency issues and suggested how I might fix them. This worked. It is all because I have been messing around with Fuppes, a promising UPnP media server that is not quite done yet. I had to compile this manually, which entailed installing a bunch of multimedia development packages, and it was two of these that tripped up the upgrade. I doubt this would have happened on a production server, and in any case one would not upgrade a production server so soon and so casually. Even so, it was a scary message.

How about the other PC? This one is a Toshiba laptop which I have written about before. I had it running sweetly, and there was really no need to fiddle with it, except that I need to try new stuff for my work. I ran the upgrade. I was presented with some difficult dialogs offering to remove “obsolete” packages. Naturally I had no idea whether these were really obsolete or not, but I allowed the upgrade to remove them on the grounds that I could always put them back later if necessary.

All went smoothly until the inevitable restart. Unfortunately the machine would no longer boot, reporting “Drive does not exist”, if I remember rightly. Fortunately I had seen this before. The upgrade restored the same wrong settings that it used on initial installation, and I had to edit the GRUB boot menu.
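For anyone hitting the same wall, the fix is the kind of one-line change below. This is a hypothetical /boot/grub/menu.lst fragment (Gutsy still used GRUB legacy); the device names and kernel version are examples, not my actual values:

```
title   Ubuntu 7.10, kernel 2.6.22-14-generic
# root  (hd1,0)    <- wrong drive, restored by the upgrade
root    (hd0,0)
kernel  /boot/vmlinuz-2.6.22-14-generic root=/dev/sda1 ro quiet splash
initrd  /boot/initrd.img-2.6.22-14-generic
```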

After that it was fine, except for a disappointing lack of 3D desktop effects, normally the most visible new feature in Gutsy. The desktop had gone a slightly deeper shade of brown (I don’t much care for Ubuntu brown) but otherwise little seemed to have changed. The Appearance Preferences did not offer anything exciting, like the rotating 3D cube effect when switching desktops.

I investigated. I went into the Synaptic Package Manager and installed compizconfig-settings-manager, following a tip from the Ubuntu forums. That helped; I now have an option called Advanced Desktop Effects Settings, and can select the Desktop Cube and more. Something is not quite right though. After the upgrade, I only had one workspace instead of 4. Apparently there is an interaction between the Workspace Switcher and the Compiz desktop effects. To add workspaces when Compiz is running, it seems you have to use the General Options in the Advanced Desktop Effects dialog, under Desktop Size. I set this to 4, then restarted the X server.

Now I had 4 desktops, and could sometimes, but not always, switch between them with a rotating cube effect. Oddly, I actually seemed to have more than 4 desktops, but could not switch between all of them using Ctrl-Alt-Arrow; for that I had to use the Workspace Switcher. Even then, I managed to get into a state where I knew OpenOffice was running, but could not switch to it at all.

At this point I reminded myself that I don’t much like the 3D cube effect anyway. Disabling it again was no great loss.

Just a few minor problems, then. The next question: does the upgrade deliver anything of value? I’ll let you know.

Adobe: friend or enemy of open source, open standards?

I’m sitting in a session at Adobe MAX Europe listening to Senior Product Manager Laurel Reitman talking about what a great open platform Adobe is creating. She refers to the open sourcing of the Flex SDK; the open bug database for Flex; the ISO standardization programme for PDF; the donation of source code to Tamarin, the Mozilla Foundation ECMAScript 4.0 runtime project; the use of open source projects such as SQLite and WebKit within AIR, the Adobe Integrated Runtime which lets you run Flash applications on the desktop; and the fact that AIR will in due course run on Linux, though the initial release will be Mac and Windows only.

So is Adobe the friend of open source and open standards? It’s not so simple. Adobe is more successful than any other company in promoting proprietary standards on the Internet. It ceased development of its viewer for the open SVG vector graphics standard, in favour of the proprietary Flash SWF format. Adobe’s efforts may well stymie the work of John Resig and others at Mozilla to foster open source equivalents to Flash and AIR. View the slides of his recent talk, which cover video support integrated into the browser, a canvas for 3D drawing, HTML applications which run from the desktop without browser furniture, and web applications which work offline. Why is there not more excitement about these developments? Simply because Adobe got there first with its proprietary solutions.

Adobe is arguably more a consumer than a contributor with respect to open source. It is using the open-source Eclipse for Flex Builder and Thermo, but as far as I can tell it is not doing much with existing open source projects within Eclipse, preferring to provide its own implementations for things like graphics and visual application development. It is using SQLite and WebKit, and will no doubt feed back bugs and improvements to these projects, but they would flourish with or without Adobe’s input. Tamarin is perhaps its biggest open-source contribution, but read the FAQ: Adobe is contributing source code, not quite open-sourcing its ActionScript virtual machine. The Flash Player itself remains closed-source, as do its binary compilers.

Like other big internet players, Adobe is treading a fine line. It wants the world to accept its runtimes and formats as standards, while preserving its commercial advantage in controlling them.

My prediction: if Adobe succeeds in its platform ambitions, the company will come under pressure to cede more of its control over those platform standards to the wider community, just as Sun has experienced with Java.

Adobe shows how anything can be a web application

The closing session here at Adobe MAX Europe was a series of “sneak peeks” at forthcoming technology, presented with a disclaimer to the effect that they may never appear commercially. I am not going to do a blow-by-blow account of these, since it was mostly the same as was shown a couple of weeks ago in the USA, and you may as well read one of the accounts from there. For example, this one from Anara Media, if you can cope with its breathless enthusiasm.

So what was interesting? Overall, Adobe is doing a good job of challenging assumptions about the limitations of web applications, and I am not just talking about AIR. A few years ago you might single out something like Photoshop as an example of something that would always be a desktop application; yet this evening we saw Photoshop Express, a web-hosted Photoshop aimed at consumers, but with impressive image manipulation capabilities. For example, we saw how the application could turn all shades of one colour into those of another colour, so you can make a red car blue. Another application traditionally considered as local-only is desktop publishing, yet here we saw a server version of InDesign controlled by a Web UI written in Flex.

The truth is, given a fast Internet connection and a just-in-time compiler, anything can be a web application. Of course, under the covers huge amounts of code are being downloaded and executed on the client, but the user will not care, provided the experience is seamless and reasonably quick. Microsoft should worry.

We also got a glimpse into the probable future of Adobe Reader. This already runs JavaScript, but in some future version this runtime engine will be merged with ActionScript 3.0. In addition, the Flash player will be embedded into Adobe Reader. In consequence, a PDF or a bundle of PDFs can take on the characteristics of an application or an offline web site. A holiday brochure could include video of your destination as well as a live booking form. Another idea which comes to mind (we were not shown anything like this) is ad-supported ebooks where the ads are Flash videos. I can see the commercial possibilities, and there are all kinds of publications which could be enhanced by videos, but not everyone will welcome skip-the-intro annoyances arriving in PDF form.

This was a fun and impressive session, and well received by the somewhat bedazzled crowd of delegates.

BBC to use Flash, Adobe streaming for iPlayer

Adobe’s Chief Software Architect Kevin Lynch announced today at Adobe MAX Europe that the BBC will use the Flash runtime for its iPlayer application, which enables UK viewers to download and play broadcasts for up to a week after their initial airing. In a short announcement, he said that the BBC will use Adobe’s technology end to end, from streaming to the cross-platform player on the client.

This appears to be a setback for Microsoft, whose technology is used in the controversial iPlayer currently in beta. It is unfortunate that the existing iPlayer is based on Windows Media Player components, rather than the new cross-platform Silverlight component which would be more suitable. The BBC has endured a hail of protest concerning iPlayer, based mainly on its Windows-only implementation, but also on installation hassles and annoyances arising from the Kontiki peer-to-peer technology which it uses. See here for my own experience.

However, Adobe’s press release suggests that the Microsoft iPlayer is not dead:

The BBC iPlayer on-demand streaming service will complement the download service currently available.

On the other hand, it seems odd that the BBC would use both a Windows-only and a cross-platform player technology. My hunch is that if the Adobe solution works as smoothly as the Flash player usually does, then the Microsoft-based service is likely to wither. I’ll be teasing out more detail on this later today.

There are a few more clues in this BBC story:

The BBC has also confirmed that users of Apple Mac and Linux machines will be able to use its TV catch-up service from the end of the year.

The broadcaster has signed a deal with Adobe to provide Flash video for the whole of the BBC’s video services, including a streaming version of its iPlayer.
