
Microsoft: Live Mesh or Live Mess? Here’s what to read.

Here’s what I suggest you read to get to grips with Live Mesh:

Amit Mital’s introduction (he’s the General Manager)

Mike Zintel’s Live Mesh as a Platform (he’s Director of Service Infrastructure)

Mary Jo Foley’s Ten things to know and the helpful stack diagram.

I have a few initial comments. First, it’s the platform that matters, not the Live Desktop which is the first thing Microsoft is delivering and which you will find presented at mesh.com. Microsoft is finally showing us what it means by the “software plus services” thing it has been talking about for so long. It involves a new “Mesh Operating Runtime” which has both cloud pieces and client pieces, a MeshFX API, and an identity system which is Live ID (formerly Passport).

As far as I can tell, Microsoft is delivering an API which we will be able to use to build internet-based data, document and configuration storage into either desktop or web applications, with synchronization to local storage for offline use. Zintel adds:

… customers will ultimately license applications to their mesh, as opposed to an instantiation of Windows, Mac or a mobile account or a web site.  Such applications will be seamlessly installed and run from their mesh and application settings persisted across their mesh
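The underlying pattern is one many developers have hand-rolled: write to local storage first, so the application works offline, then reconcile with the cloud when a connection is available. Here is a minimal sketch of that pattern in Java. To be clear, this is my own illustration of the concept, not the MeshFX API, whose details Microsoft has yet to document fully:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.ArrayDeque;
    import java.util.Deque;

    // Illustration of the write-local-first, sync-later pattern that
    // Live Mesh promises to automate. None of this is the real MeshFX API.
    public class OfflineFirstStore {
        private final Path localDir;                             // local replica; works offline
        private final Deque<Path> pending = new ArrayDeque<>();  // changes queued for upload

        public OfflineFirstStore(Path localDir) throws IOException {
            this.localDir = Files.createDirectories(localDir);
        }

        // Saves go to local storage first, so the app never blocks on the network.
        public void save(String name, String content) throws IOException {
            Path file = localDir.resolve(name);
            Files.writeString(file, content);
            pending.add(file);
        }

        // Called when connectivity returns: push queued changes to the cloud.
        public void sync(CloudEndpoint cloud) throws IOException {
            while (!pending.isEmpty()) {
                Path file = pending.poll();
                cloud.upload(file.getFileName().toString(), Files.readString(file));
            }
        }

        // Stand-in for whatever remote storage service sits at the other end.
        public interface CloudEndpoint {
            void upload(String name, String content) throws IOException;
        }

        public static void main(String[] args) throws IOException {
            OfflineFirstStore store = new OfflineFirstStore(Path.of("mesh-replica"));
            store.save("notes.txt", "works offline");
            store.sync((name, content) -> System.out.println("uploaded " + name));
        }
    }

The pitch with Mesh is that the runtime supplies this plumbing for you, plus the hard parts my sketch glosses over: conflict resolution, identity, and fan-out to multiple devices.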

It sounds good, though the obvious question is whether Microsoft is overstating the importance of the client in an attempt to preserve its core market. Do we need this special client piece? Here’s a paragraph from Zintel’s piece that caught my eye:

A key design goal of the Live Mesh data synchronization platform is to allow customers to retain the ownership of their data that is implicit with local storage while improving on the anywhere access appeal of the web. The evolution of the web as a combined experience and storage platform is increasingly forcing customers to choose between the advantages of local storage (privacy, price, performance and applications) and the browser’s implicit promise of data durability, anywhere access and in many cases, easy sharing.

Can Microsoft improve on the “anywhere access appeal of the web”? Zintel says we need to combine it with the advantages of local storage, but the advantages he identifies are not all that convincing. Let’s look at them:

Privacy: maybe, but local data is vulnerable to worms, trojans and viruses; well-secured Internet data accessed over SSL is arguably more secure. Data not connected to the Internet is nice and secure, but can’t participate in the Mesh.

Price: I don’t see how Mesh helps here. Yes, local storage is cheap, but as soon as data enters the Mesh it is on the Internet and we are paying for data transfer as well as possibly Internet storage. I realise that Microsoft (among others) offers generous Internet storage for free, but that is just a way of buying market share.

Performance: Granted, some types of application work faster with local storage. Still, there are non-Mesh ways of getting this from web applications in a fairly seamless manner, such as Google Gears or Adobe’s AIR.

Applications: This is perhaps the big one. Many of us are reluctant to do without traditional local applications such as Office. Well, mainly Office. Still, web equivalents get better all the time. One day they will be good enough; and new technology like Silverlight is bringing that day closer. 

What about identity management and permissions? Zintel says:

A side effect of the competition to store customer data in the cloud and display it in a web browser is the fragmentation of that data and subsequent loss of ownership. Individual sites like Spaces, Flickr and Facebook make sharing easy, provided the people you are sharing with also use the same site. It is in fact very difficult to share across sites and equally difficult to work on the same data across the PC, mobile and web tiers.

True; but Mesh currently identifies users by their Live ID. Isn’t that the same as Spaces?

If Microsoft delivers a bunch of useful web services, that’s great. If it tries to somehow replace the web with its Mesh, it will fail.

Mary Jo Foley also asks the question: to what extent is Microsoft extending, and to what extent is it replacing, existing Live services such as Office Live or the excellent Skydrive? Making sense of all this is a challenge.

Now let’s mash all this up with Yahoo! (maybe). Ouch.

CDs, DVDs still account for more than 75% of music industry revenue, but shrinking

The RIAA has posted 2007 year-end manufacturers’ unit shipments and value statistics. This covers USA music and music video sales.

Here are my highlights:

  • CD sales down 17.5% in units and 20.5% in value year-on-year. The peak was 2000; value is down around 44% since then (actually a smaller decline than I would have expected).
  • Downloads are up 38.9% in units and 43.2% in value.
  • Mobile (ringtones etc.) is, amazingly, only 30% lower by value than other digital download sales, if I’m reading the chart right.
  • Revenue from physical media (mostly CDs) is more than 75% of the total.
  • Overall revenue is down by 11.8% – that’s the figure that hurts the music industry.
  • Finally, vinyl is up by 46% but from a small base; it’s only a tiny niche.

Every time I look, another CD shop has shrunk or closed on the high street, so I expect there will be more of the same next year.


How to code better: new book from ThoughtWorks

Which is more important: the code you write, or the tools you use? Instinctively I would say it is the code which really counts; but vendors tend to focus on tools, because that is what they sell. I found myself thinking about this last week because, while I was puzzling out the new Application Factories feature in CodeGear’s JBuilder 2008, a book arrived on my desk. It is called The ThoughtWorks Anthology, and subtitled “Essays on Software Technology and Innovation.” The title is the worst thing about it, since it sounds like some kind of corporate puffery from ThoughtWorks (and I suppose it is); but it is also a concise, enjoyable and stimulating read. I found it particularly refreshing as a break from JBuilder, since it concerns stuff that matters: code, testing, project management – as opposed to stuff that might introduce more problems than it solves: the latest wonder tool which promises to speed development but might end up as just another piece of clutter.

I turned first to an entertaining chapter by Jeff Bay, called Object Calisthenics. Bay sets out nine rules; I don’t want to steal all his thunder, but here are three of them:

  • Don’t use the else keyword
  • Use only one dot per line
  • Don’t use any classes with more than two instance variables

Before you scream at him, note that this is an exercise, not a set of rules for live projects, though at the end of the chapter Bay says they can be used as such. It doesn’t matter whether you agree or not; what counts is that it makes you think about how to improve the design of your code.
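To see the flavour of the exercise, take the first rule. Banning else pushes you towards guard clauses (or, in larger cases, polymorphism). Here is a tiny before-and-after of my own devising, not an example taken from the book:

    // My illustration of Bay's "don't use the else keyword" rule,
    // not code from the book.
    public class Discount {

        // With else: the branching reads as one tangled decision.
        static double priceWithElse(double amount, boolean loyalCustomer) {
            if (loyalCustomer) {
                return amount * 0.9;
            } else {
                return amount;
            }
        }

        // Without else: a guard clause exits early, keeping the method
        // flat; bigger cases push you towards polymorphism instead.
        static double priceWithoutElse(double amount, boolean loyalCustomer) {
            if (loyalCustomer) return amount * 0.9;
            return amount;
        }

        public static void main(String[] args) {
            System.out.println(priceWithoutElse(100.0, true)); // prints 90.0
        }
    }

Trivial here, of course; the point of the exercise is what the constraint does to a method fifty lines long.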

Martin Fowler writes about how to write a DSL (domain-specific language) in Ruby. Rebecca Parsons describes how to categorize languages. Neal Ford makes the case for polyglot programming. Erik Doernenburg discusses the use of annotations/attributes in Java and .NET for application domain metadata. Julian Simpson has a great chapter on refactoring Ant build files, which can easily become hard to maintain. Two final chapters cover test strategy and performance profiling strategy.
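Doernenburg’s subject is the easiest to show in a few lines. In Java, annotations let you attach domain metadata directly to the code it describes and read it back with reflection; here is a trivial sketch of my own (not code from the book):

    import java.lang.annotation.*;
    import java.lang.reflect.Field;

    // My sketch of annotations carrying application domain metadata,
    // in the spirit of Doernenburg's chapter.
    public class MetadataDemo {

        @Retention(RetentionPolicy.RUNTIME)
        @Target(ElementType.FIELD)
        @interface Audited {
            String reason();
        }

        static class Account {
            @Audited(reason = "regulatory requirement")
            double balance;
        }

        public static void main(String[] args) {
            // Tools and frameworks can pick up the metadata at runtime
            for (Field f : Account.class.getDeclaredFields()) {
                Audited a = f.getAnnotation(Audited.class);
                if (a != null) {
                    System.out.println(f.getName() + " is audited: " + a.reason());
                }
            }
        }
    }
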

I particularly like the way in which the book itself reflects the coding principles it presents. Each chapter is short, clear and focused. A good read for your next transatlantic flight.

The ThoughtWorks Anthology (Pragmatic Bookshelf, ISBN-10: 1-934356-14-X)

Amazon links:

US: The Thoughtworks Anthology

UK: The Thoughtworks Anthology

Britannica going more towards free

Towards free, not completely free. This is a fascinating example of old-school vs new school (Wikipedia and Google – I mean Google search, not the Knol thing). Britannica is opening up its content to “online publishers”, including qualifying bloggers – low traffic is OK, but infrequent posting is not. The idea is to encourage these users to post Britannica links on their sites. Such links will bypass the paywall, enabling non-subscribers to read articles that would otherwise require subscription.

We can debate the quality of Britannica’s more scholarly articles versus Wikipedia’s living encapsulation of crowd wisdom. The real question is this: what is Britannica’s business model when something that many people will feel is “good enough” is available for nothing?

Here’s what the FAQ says:

Won’t you lose money giving away all those subscriptions?

We don’t think so. On the contrary, with many Web publishers using our products and sharing them with our readers, we expect to see a lot more people subscribing.

On the other hand, might not existing subscribers feel that the value of their subscriptions is diminished by the giveaway?

I suspect this is an attempt to rebuild its brand and experiment with different business models, such as advertising.

Prediction: in time, Wikipedia will include more attributed and locked content, while Britannica will add user comments, ratings, and even entirely user-generated articles (marked as such, of course). In other words, they will converge. The winner will be the site with the most traffic. If I’m right, Britannica’s new initiative is a move in the right direction, but very late in the day.

As an aside, I thought this part of the FAQ was not very Britannica-like:

I blog regularly, but I don’t have much traffic. Will that disqualify me?

Nope. You need Britannica more than anybody. Start reading it, and your posts will burn with brilliant, scintillating insights; link to Britannica articles, and readers will be eternally grateful. Your traffic will soar.


RIA means … not much

Ryan Stewart has a go at nailing what the term Rich Internet Application means.

I think he’s coming at this from the wrong end. It’s better to look at the history.

As far as I’m aware – and based partly on my own recollection, and partly on what Adobe’s Kevin Lynch told the press at last year’s MAX Europe – the story begins around 2001, when WebVertising Inc created an online booking application for the Broadmoor Hotel in Colorado Springs. It was an HTML application redone in Flash. A PDF describing what was done is still online, and discusses some of the differences between the HTML and Flash approach, though bear in mind this is Flash evangelism.

They created iHotelier, a fully interactive, data-driven reservation application that reduces the entire reservation process down to a single screen. Users looking for information on available rooms for specific dates highlight their preferred dates in a calendar. With one click of the mouse, the Flash application displays the available (and unavailable) rooms, and their cost. (Figure 10) As a result, users do not feel like they’ve wasted a lot of time and effort if their first room choice is not available.

This case study seemed to trigger a new awareness at Macromedia concerning the potential of Flash for complete applications. I don’t mean that it had never been thought of before; after all, it was Macromedia that put powerful scripting capabilities into Flash, and I’m sure there were Flash projects before this that were applications. Nevertheless, it was a landmark example; and it was around then that I started hearing the term Rich Internet Application from Macromedia. Wikipedia claims that this paper [PDF] is the first use; it’s by Jeremy Allaire and dated March 2002. I’m sure Allaire himself could provide more background.

The problem with the term, as you can see from Allaire’s paper, is that Macromedia (now Adobe) tends to define it pretty much as whatever their latest Flash technology happens to be. This shifts around; so if you are at an AIR event, it’s AIR; if you are at a Flash event, it’s Flash; if you are at a Live Cycle event, it’s apps that use Live Cycle.

Microsoft muddied the waters a little. Realising that RIAs were attracting attention, it started using the term to describe its own technology too, though in the spirit of “embrace and extend” it changed it to mean “rich interactive application”. As I recall, Microsoft used it mainly to describe internet-connected desktop client applications such as those built with Windows Forms. Something like iTunes is a great example (even though it is from Apple), since it runs on the client but gets much of its data from the Internet, especially when you are in the iTunes store.

Now it remains a buzzword but honestly has little meaning, other than “something a bit richer than plain HTML”. If you were doing the Broadmoor Hotel app today, you could do it with AJAX and get similar results.


mvn cloudtools:deploy

I love this. Write your Java EE app; then deploy to up to 20 virtual servers like this:

mvn cloudtools:deploy

The servers are Amazon EC2 instances, charged by the hour.

The tool comes from Chris Richardson, author of POJOs in Action. It combines a Groovy framework called EC2Deploy with appropriate Amazon virtual machine images and a Maven plugin. He calls the combination Cloud Tools. More on EC2Deploy here. The Cloud Tools home page is here. Open source under the Apache License 2.0.

Great for testing, could be good for live deployment too, especially now that you can get proper support from Amazon.

See also Jeff Barr’s Amazon Web Services Blog.

WordPerfect X4: not good at PDF, OOXML, ODF import

I don’t envy anyone trying to sell a word processor or office suite that is neither Microsoft Office, nor free. Corel has just released WordPerfect X4, which it is promoting as a PDF editor. Both Open Office and Microsoft Word 2007 can save in PDF format, but WordPerfect can open PDFs as well. That could be a handy feature, though PDF was conceived as an output format, so arguably it is not a big deal. In any case, you may be less keen on the idea once you read the online help, as opposed to the marketing blurb: the help explains that WordPerfect does not preserve the formatting of most PDF documents. If it’s an honest PDF, you get something editable but possibly different:

The layout in the imported PDF may be different from the layout in the original PDF, but you can still modify text strings and create a new document without having to copy or redesign all the elements.

If the PDF contains images of text, WordPerfect uses OCR to scan the images and generate editable text. Again, that could be handy, but if you think you can use WordPerfect to open an incoming PDF, make a few changes, and send it on its way, think again.

Let me add that all my attempts to import a PDF into WordPerfect have failed. I installed the trial, and tried to open the first PDF I came across – a 20-page Forrester report. WordPerfect whirred away at 25% CPU, using over 1GB of RAM (I have lots installed), and eventually offered a blank document. I tried again, and it crashed. Finally, I started a new document, typed the word “Test”, and exported it to PDF. Then I tried to open it in WordPerfect – nothing. It opens fine in Acrobat. I guess something is broken in my install.

Personally I am more interested in its support for OOXML (or possibly OXML), the native format in Microsoft Office 2007 and the subject of contentious ISO standardization. WordPerfect X4 has the cheek to make itself the default editor for .docx, .xlsx, and .pptx files. Again, I opened the first document I came across, an 8-page Q&A: very simple, no images. The good news is that it opened. The bad news: plain text became bold, paragraph spacing disappeared, and the result looked worse than the original.

Next, I tried a document with a more complex layout. This is actually a bidding card for use in Duplicate Bridge: you can find similar ones here, but in .doc format – mine was one I had amended and saved as .docx. WordPerfect opened it, but with the layout completely messed up. Graphics were lost. I tried opening the old .doc version. Better, but still not right. It spread the document over 5 pages, a shame when it is meant to be printed on two-sided A4 to make a card. Open Office, on the other hand, handled the .doc version nicely; I was impressed.

That gave me an idea for a further torture test. Open the bidding card in Open Office, save in ODF format, which WordPerfect X4 is also meant to support. Now open the .odt in WordPerfect X4. It crashed.

WordPerfect X4 may have all sorts of good points as a general Office Suite, but what about this claim in the press release [PDF]:

File Format Freedom
In addition to its significant PDF enhancements, WordPerfect Office X4 now provides suitewide compatibility with Microsoft Office 2007 files (OOXML) and Open Document Format (ODF) in WordPerfect X4. With PDF-reading software installed on more than 80% of all U.S. PCs (Source: Jupiter Research), WordPerfect Office X4 enables users to collaborate and share files more broadly and more effectively than ever before.

Hmmm.

Update

I did a bit more experimentation. It turned out that the worst case (in terms of messed-up formatting) involved a document which had originally been pasted from HTML. I imagine it was a bit of a mess internally, so perhaps one should excuse WordPerfect (though users don’t understand these distinctions). I reconverted a Bidding Card document and this time WP did better. Here are some images. First, a portion of the document in Word:

[screenshot: the document as rendered in Word]

Now, here’s the doc imported into WordPerfect X4. Not right, but looks fixable:

[screenshot: the document imported into WordPerfect X4]

Here it is in Open Office:

[screenshot: the document in Open Office]

Identical to the Word rendering as far as I can see. Then I saved as .odt and opened in WordPerfect X4:

[screenshot: the .odt file opened in WordPerfect X4]

Windows Server 2008 is better than Vista, but why?

Mark Wilson asks:

It seems that, wherever you look, Windows Server 2008 is almost universally acclaimed. And rightly so – I believe that it is a fantastic operating system release (let’s face it, Windows Server 2003 and R2 were very good too) and is packed full of features that have the potential to add significant value to solutions.

So, tell me, why are the same journalists who think Windows Server 2008 is great, still berating Windows Vista – the client version of the same operating system codebase?

The short answer is that Server 2008 delivers new features that customers wanted, whereas Vista delivers new features that Microsoft thought its customers should want. However, it seems there may be more to it than that. Maybe Server 2008 really does perform better than Vista.

According to this post, Server 2008 performs 11-17% faster than Vista SP1, running a couple of benchmarks which test typical client applications. Christian Mohn concurs:

Windows Server 2008 performs better, even with the Aero features enabled, than Vista ever did on the same hardware. To me, this [is] a bit strange, even if a lot of services are still disabled, as the codebase is pretty much the same as Vista.

though Mohn’s example is less scientific: he never ran Vista SP1, and also moved from 32-bit to 64-bit.

Server 2008 has a “Desktop Experience” feature, which installs things like Windows Media Player, Aero GUI effects, and other fluff that doesn’t belong on a server. My assumption had been that once you installed this, Server 2008 would perform in a similar manner to Vista. Apparently this is not the case.
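For anyone who wants to repeat the experiment, Desktop Experience installs like any other server feature. From memory, the command-line route on Server 2008 is something like this (double-check the feature name; and as I understand it, Aero also needs the Themes service switched from Disabled to Automatic afterwards):

    servermanagercmd -install Desktop-Experience
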

It seems to me there are a few possibilities. One is that Microsoft isn’t being straight with us about this “same codebase” stuff. It would be interesting to analyze the core DLLs and work out which are the same, and which are different.
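A crude first pass needs nothing exotic. Vista SP1 and Server 2008 are said to come from the same build (6.0.6001), so genuinely shared binaries should be byte-identical; hashing the same DLL on both systems and comparing the output would show how far that holds. For example, using the certutil tool that ships with Windows, run this on both machines:

    certutil -hashfile C:\Windows\System32\kernel32.dll SHA1

Matching hashes on the core DLLs would support the “same codebase” claim; widespread differences would suggest the two diverged more than Microsoft lets on.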

The second possibility is that there’s stuff in Vista which is not part of the core, nor part of the Desktop Experience, but which slugs performance. If so, it would be great to identify it and turn it off.

The third explanation is that the testers are wrong, and that performance is actually similar. For example, maybe Vista was running a background update or backup during tests. Background processes make it hard to conduct truly rigorous performance comparisons.

I’d like to see Mark Russinovich get his teeth into this. I’m also tempted to try the Server 2008 desktop experiment myself.

Giving up on the mobile web

Mowser, a start-up which provides a service that makes web sites mobile-friendly, is giving up. Founder Russell Beattie says:

I don’t actually believe in the “Mobile Web” anymore … anyone currently developing sites using XHTML-MP markup, no Javascript, geared towards cellular connections and two inch screens are simply wasting their time.

His point is that devices are adapting to enable browsing of the full web, making attempts to adapt the web to devices rather pointless. Which is pretty much what I said six months ago.

This isn’t absolute. As a mobile web user, I’ve appreciated mobile versions of sites like Google or the BBC. It soon becomes frustrating though, because so many sites are not designed to work well on mobile browsers, and never will be. Fix the mobile browser, and you get the lot.
