Free McFly CD in Mail on Sunday shows media powershift

Today’s Mail on Sunday in the UK has McFly’s new CD, Radio:Active, as a free insert. Free CDs and DVDs are like confetti in the weekend press these days; but this one is distinctive in that it is new material and comes from a band not yet in the twilight of its career – insofar as these things can be predicted.

The band has its own label, Super Records, which gives it the freedom for this kind of experiment.

Not so long ago, the CD was itself the upsell – music companies would give away other stuff in order to promote the CD. Now that’s changed; free exchange of digital music has undermined the value of CDs, and McFly has figured out that promoting the band is more important.

But what is the new upsell? Performances, tat, ringtones, digital audio and video downloads. Trouble is, it’s not going to add up to as much as CD sales did in the old days.

Oh well, there’s always the deluxe CD and DVD package, with four extra tracks and a booklet, coming in September. Perhaps not so much has changed after all (except it has).

And the music? Lightweight pop, compressed to hell. No complaints about the lightweight pop; but the sound quality is much worse than on the first and second CDs, Room on the Third Floor and Wonderland.


Testing a web service with IIS 7 on Vista

Not long ago I created a simple CRUD example using Silverlight 2.0 beta 2. I used Visual Studio 2008 and the ASP.NET Development Server. I wanted to test the same WCF web service with a different client (more on that soon), so I decided to deploy it to the instance of IIS 7.0 which comes with Windows Vista. I created a new web site on a different port than the default.

Nothing worked. Reason: although I had installed .NET 3.5 SP1 beta and Vista SP1 – which should do this automatically – the IIS 7 MIME types and handler mappings were not configured for Silverlight and WCF. How to fix the MIME type is here, and the handler mappings here.
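For reference, the fixes amount to something like the following, run from an elevated command prompt (a sketch only, assuming default install paths; the linked articles have the detail):

rem add the Silverlight MIME types to IIS 7 (appcmd lives in %windir%\system32\inetsrv)
appcmd set config /section:staticContent /+"[fileExtension='.xaml',mimeType='application/xaml+xml']"
appcmd set config /section:staticContent /+"[fileExtension='.xap',mimeType='application/x-silverlight-app']"

rem register the WCF handler mappings with IIS
"%windir%\Microsoft.NET\Framework\v3.0\Windows Communication Foundation\ServiceModelReg.exe" -i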

The web service still didn’t work. I got:

A first chance exception of type ‘System.ServiceModel.ProtocolException’ occurred in System.ServiceModel.dll.

I changed the debug options to break on all managed exceptions, and got this further detail:

The remote server returned an unexpected response: (404) Not Found.

Problem: Silverlight was looking for a cross-domain policy file. At this point I was still running the Silverlight app from the ASP.NET Development Server, so Silverlight treated the IIS-hosted service as being on a different domain. The 404 error does not make this obvious; but a quick Google for Silverlight 404 shows that this is a common problem.

Silverlight is designed to support cross-domain policy files in either Microsoft’s format (clientaccesspolicy.xml) or Adobe’s format (crossdomain.xml). If the service is just for Silverlight, use Microsoft’s format; otherwise I suggest adding both.
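For what it’s worth, this is roughly what a wide-open clientaccesspolicy.xml looks like, dropped into the root of the IIS web site. It is fine for testing on a local machine; tighten the domain list before exposing a real service:

<?xml version="1.0" encoding="utf-8"?>
<access-policy>
  <cross-domain-access>
    <policy>
      <!-- allow calls from any domain; restrict this for production -->
      <allow-from http-request-headers="*">
        <domain uri="*"/>
      </allow-from>
      <!-- grant access to the whole site, including sub-paths -->
      <grant-to>
        <resource path="/" include-subpaths="true"/>
      </grant-to>
    </policy>
  </cross-domain-access>
</access-policy>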

Nearly there; but I still had to fix SQL Server authentication. I normally use Windows authentication, and if you are using the ASP.NET Development Server this just works. Move to IIS though, and it does not work unless you set up ASP.NET impersonation or create a SQL Server login for the account under which the application pool is running. Oddly, when I tried the app without fixing the SQL Server login I still got a 404 exception; I’m not sure why.

Incidentally, I noticed that if you configure ASP.NET impersonation for a web site, the username and password get written to web.config in plain text (bad). If you configure the application pool to run under a different account, the password is encrypted in applicationHost.config (better). In the end I decided to use good old SQL Server authentication.
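In practice that just means a connection string along these lines in web.config; the server, database and credentials below are invented for illustration, and the connectionStrings section can itself be encrypted with aspnet_regiis -pe if the plain-text password bothers you:

<connectionStrings>
  <!-- SQL Server authentication: the credentials travel in the connection string -->
  <add name="CrudSampleConnection"
       connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=CrudSample;User ID=cruduser;Password=secret"
       providerName="System.Data.SqlClient" />
</connectionStrings>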

One last tip: when debugging a web service, put the following attribute on the class which implements your ServiceContract:

[ServiceBehavior(IncludeExceptionDetailInFaults = true)]

Otherwise you get generic fault messages that don’t help much with debugging. Remove it though for release builds.
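One way to make sure it does not slip into a release build is to compile the attribute in only when the DEBUG symbol is defined; the service and contract names below are just for illustration:

using System.ServiceModel;

[ServiceContract]
public interface ICrudService
{
    [OperationContract]
    string Ping();
}

#if DEBUG
// Detailed fault information is handy while debugging, but should never ship
[ServiceBehavior(IncludeExceptionDetailInFaults = true)]
#endif
public class CrudService : ICrudService
{
    public string Ping()
    {
        return "pong";
    }
}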

Once I’d fixed the SQL Server login, everything was fine.

Why I can’t use Microsoft Live Search for real work

I’ve complained before about the quality of Microsoft’s Live Search vs Google; but today’s example seemed too good an illustration not to mention.

I needed to update Windows XP to SP3. In particular, I wanted what Microsoft calls the “network” download; that is, the entire service pack, not the launcher app that initiates a piecemeal download tailored to the specific machine.

I pulled up Live Search and entered “windows xp sp3 download”.

To my surprise, Live Search offered me only third-party download sites in its first page of results. Actually, that’s not strictly true: at number 8 is the download for “Windows XP SP3 RC2 Refresh” (obsolete); and at number 10, the general home page for XP downloads:

Find popular Windows XP downloads, including PowerToys, trial software, tools and utilities

I tried Google. Same search. The official network download is in first place. The official piecemeal download is second.

I know: you can argue that this is just an isolated search, and that some other search might show Google in an equally bad light. However, I find this constantly: Google gets me much better results. Further, this case is particularly telling, since a third-party download site is not what you want when patching Windows. Quite likely those other sites do point you to the correct official download eventually; but getting Microsoft downloads from Microsoft’s site is safer.

I am not surprised Microsoft has a tiny share of the search market; and I don’t believe this is simply because of Google’s clever marketing.

Update: the results described above still match what I get today. However, users in different countries may get different results; from the comments below I suspect that US users get better results in this instance. Maybe Live Search is worse in the UK than in the US; I’d be interested to know.


Adobe Reader 9 brings AIR to the world

Adobe has released the free Adobe Reader 9. This includes an AIR application to support Acrobat.com, a document management and collaboration site now in beta. Since Reader gets installed on most of the world’s active computers, this strikes me as a significant moment for Adobe’s new desktop runtime.

The actual application is disappointing. It does not do the main thing laptop users would like it to do, which is to synchronize documents for offline use. Nor does it include a desktop version of Buzzword; the link simply opens your web browser. Finally, the obsession with Flash seems silly. I saved a document as RTF and uploaded it to Acrobat.com by drag-and-drop. Then I double-clicked the document in the list, hoping it would open in Word (or whatever editor is registered for RTF). No luck: the app creates a Flash preview, rather slowly.

In other words, to edit a document I have to download it and then open it. If I want to have the revised version in Acrobat.com, I have to save it and re-upload it.

What’s odd is that Buzzword can open RTF files. So why isn’t there an option to “edit in Buzzword”?

Even going first to Buzzword does not help. Click the Buzzword link, Buzzword opens in your web browser. Go to your documents: you get your Buzzword document store, not your Acrobat store. Wait – there’s an option “to access your other Acrobat.com files click here”. Click. Now I’m in Acrobat.com. Open. No luck: it’s still a Flash preview. Do I really have to download, go back to Buzzword, and re-upload to edit this online?

Microsoft’s SharePoint does this much better. I presume Adobe will fix the Acrobat.com – Buzzword integration, though going further and enabling smooth offline editing and saving in native applications such as Word may be too difficult.

Never mind. This is going to get AIR installed everywhere.

The sad story of the LG Viewty – case study in Web 2.0 failure

The LG Viewty (KU990) is a decent camera phone which came out last year in the wake of the first iPhone; yes, it is a me-too product, but it has a few advantages over Apple’s product, such as a 5-megapixel camera.

The trouble is, there are niggles, some minor, some major – like video recordings losing sound. A little over six months on, users are posting messages like this:

I have to admit that I hate this phone. I have had mine six months now and I have regretted it for some time. It’s unresponsive, the camera is poor most of the time, it’s really awkward to use. It’s very slow at taking photos. It can’t hold a signal. The battery life is getting poor. It won’t handle many music file types very well. It feels like a Beta phone. The support from LG is non existent. I’m so disappointed. I look at apple, and yes, people did have to fork out for their phones, but look at the level of support and development they’re getting – new stuff is being added all of the time. It almost feels like they built this phone as a test for some of the features to go into other products.
The best thing about this phone is the video, and I love that part of it. I have some real magic moments captured and I’m grateful. But that is it.
I speak for myself, but I will never get another LG phone, period. As soon as I can get bought out of my contract the better.

It is a big change in mood from when the Viewty was released. This huge thread on the What Mobile forum has the story. Early adopters loved it – except for a few niggles which they hoped and expected would be fixed by a firmware update.

There has been no firmware update. Presumably all the software folk at LG have moved on to the next shiny device. Viewty users feel abandoned.

This seems like a good case study in not getting Web 2.0. Ironically, LG made an effort to exploit social networking when the Viewty was launched. LG contacted bloggers and offered phones for review; I reviewed it here. There is an official LG UK Blog – which sadly is pure marketing fluff and has done nothing to engage with the community over the issues which have been raised. There is an official Viewty website that has lots of Flash multimedia but little substance.

Yesterday I wrote about purchase decisions that begin with a Google search. Mobile phones are a good example. Anyone who does their Web homework will be put off the Viewty; and indeed deterred from newer LG models because the same thing will likely happen again. Network effects work both ways; even those who do not live on the Web will be influenced by opinion-formers who do.

It seems to me that a relatively small investment in communication and post-release software updates and support would yield a significant improvement in sales.


Lively attack on Microsoft’s poor marketing – from within

Microsoft employee Kirk Allen Evans has a go at Microsoft’s marketing efforts:

I am so completely and utterly sick, as an employee and a Microsoft shareholder, of seeing empty spending on crap like "People_Ready".  Remember the completely ridiculous Office Dinosaur spots?  C’mon, marketing, grow a pair… let’s see some results.  No, I don’t want to see a retort ad making fun of the "I’m a Mac, I’m a PC" goons.  That ship has long since sailed.  Let’s see what all that Microsoft money and some of the smartest people in the world can come up with.

He’s right. So are the comments to his post, observing that marketing isn’t the only problem, or even the core problem.

Still, Vista is now actually better than its reputation. That’s a marketing issue.


Web 2.0 for the rest of us?

We all know what Web 2.0 means. Google, Flickr, Facebook, Yahoo, mash-ups (usually with Google Maps or Flickr), Salesforce.com, and anything but Microsoft. But what does it mean for everyday businesses, like some of the small businesses I talk to from time to time? Some are sceptical. One I can think of sells a successful software application but does not even run a support forum – why make it easy for others to discuss and publicise flaws and problems in your product?

I was interested therefore in a recent book by Amy Shuen, called Web 2.0: A Strategy Guide. A foreword by Tim O’Reilly says, "it is the first book that really does justice to my ideas". It was O’Reilly who popularized the Web 2.0 concept – and yes, it is an O’Reilly book.

Shuen writes enthusiastically about network effects, using Flickr, Netflix, Google, Facebook, LinkedIn, Amazon and Apple (iPod/iTunes/iPhone) as case studies. I enjoyed it, but the problem with this kind of book is the chasm between these few web giants and everyone else. Another problem is the tendency to ignore the Web 2.0 graveyard – thousands of start-ups that fail, or moribund and/or spam-infested blogs and forums. Since there are more failures than successes, it would be sobering to investigate these rather than riding a wave of Web 2.0 hype. Nevertheless, it is a thought-provoking book with an extensive bibliography, and not a bad starting point for investigating Web 2.0 concepts. I liked the “five steps to Web 2.0”, which begin with finding collective value and end with perhaps the most important: what Shuen calls “recombining innovations”:

New-style click-and-mortar, online-offline network partnerships focus on bridging and building new networks rather than replacing or disrupting the infrastructures of offline companies.

I’ve also received a short Web 2.0 book by Marco Cantù, called The Social Web. It is a brisk tour of the sites and concepts that form today’s online communities. Typical readers of this blog probably won’t find anything new here; but I liked the common-sense tips on things like blogging and creating interactive web sites.

I would argue that almost all businesses either are, or should be, “click-and-mortar” entities. Whatever business you are in, a useful question is: what proportion of purchases in your sector begin with or include a Google search? If the answer is significant, you are in the Web 2.0 business.

That does not mean SEO (Search Engine Optimization) is the answer to everything. I am an SEO sceptic. All too often, SEO is lipstick on a pig. Optimise your web site for users, not robots. Further, it is no good trying to get users to interact with you, if you are not willing to interact with them. Surprisingly, I see this all the time. I suggest spending less time worrying about high Google ranking, and more time worrying about what users find when they do land on your site.

The case studies that interest me most are where old-style businesses have found ways to engage successfully with Web 2.0 innovations. For example, I’ve written about kitchons.com, which services domestic appliances and tunes its business via Google ads. I came across another example today: a financial company which lets you put an image from Flickr on your credit card. Clever.

Web 2.0: A Strategy Guide by Amy Shuen (ISBN 978-0-596-52996-3), O’Reilly, $24.99.

The Social Web, written and published by Marco Cantù, from Lulu; $17.39 in print, $8.70 electronic.


Changing models of journalism

Chris Green, editor of IT Pro, has written about analysing professional writers in terms of “costs per unique user visit”. He says:

I honestly believe that in the not too distant future, online publications in all sectors, not just technology, will have to adopt a results-driven approach to freelance commissions in order to maximise revenue and to achieve maximum return from their freelance budgets.

The most likely outcome will be that publications begin paying writers purely on how much traffic an article pulls in. Also likely is that commissioning editors will need to take a more frequent and brutal approach to deciding which freelancers to commission regularly and which to drop from their rotation, based on the kind of metrics I am currently looking at.

I write for several publications, print and online, and in every case I am paid per word. If this prediction is accurate, how will this affect me and others who write for money?

Green says writers will have to work harder at pulling in readers. He talks about search engine optimisation (SEO), link seeding, cross-linking, encouraging comments, and supplying photos and even videos as well as words (no doubt to the fury of pro snappers).

If writers are paid per view, clearly they will have more incentive to do such things. Best tread carefully though. Link seeding done badly is spamming. Encouraging comments done badly is trolling. SEO done badly is keyword madness.

Further, there must be a reason why writers rarely write their own headlines. Publishers decided long ago (in print and online) that writing attention-grabbing headlines, which is a kind of SEO, is a job best done by specialists. So is snapping pictures, designing page layouts, and marketing the results. Giving the writer more of these roles doesn’t make sense except for low-budget publications like, errm, blogs. It also gives writers less time for their core competency, which is researching and writing.

Another problem is that not all traffic is equal. If a publication is ad-funded, then the traffic that counts most is that from potential purchasers, those who approve budgets or click ads. Click ratios are easy to measure, but profiling readers per-article is harder.

I agree that the Web is changing journalism, mostly for the better. One of my reasons for starting and persevering with this blog is that I value its immediacy, the feedback from readers, and the comments from those about whom I blog. The quality of the writer-reader interaction is immeasurably better than in the old days of occasional letters printed ages after publication.

Further, I don’t think any writer should mind being paid in some sense by results. Book authors have always had to put up with (or enjoy) this approach.

The problem is how to measure those results. Pay per view sounds good; but it punishes writers who happen to get commissioned for less popular subjects. If those subjects are nevertheless ones that the publication wants to cover, that suggests scope for bargaining.

What about measuring quality? The Register now lets readers rate some articles from one to ten. Nice if you get a good score; but is this more a measure of excellence, or of what readers agree with?

Kudos to Chris Green for throwing this open for debate.
