Amazon MP3 store is much cheaper than Apple iTunes

The Amazon MP3 store has arrived in the UK, and I’ve noticed that it is much cheaper than Apple iTunes for many items, particularly when buying complete albums. Here’s an example: Day & Age by The Killers, which is £7.99 on iTunes and £3.00 on Amazon.

That’s 62% cheaper – a saving of £4.99 on £7.99. Amazon also sells the CD for £8.98. Since you get more for your money with a CD (no lossy compression, a physical backup, sleeve notes, transferable rights), that strikes me as about right.

The MP3 format is also more convenient than iTunes AAC, since it is supported by more devices.

I’m intrigued though. Why is Amazon so much cheaper? A last-ditch effort by the industry to create serious competition for Apple?


Windows Azure: since PDC, how is it going?

At the Professional Developers Conference, held at the end of October 2008, Microsoft unveiled Windows Azure, its new cloud platform. I was there, and got the impression that this is a big deal for Microsoft; arguably the future of the company depends on it. It is likely that the industry will reduce its use of on-premises servers in favour of hosted applications, and if Microsoft is to preserve its overall market share it needs a credible cloud platform.

That was nearly two months ago. What has been the developer reaction, and how is it going with the early tech previews made available at PDC? It’s hard to tell, but there is less public activity than I expected. At the time of writing there are just 550 messages on the official Azure forums, and glancing through them shows that many are from people simply having difficulty signing up. One of the problems is that access to the preview is limited by developer tokens of various types, and although Microsoft gave the impression at PDC that all attendees would have these, that has not really been the case. Those who attended hands-on labs at PDC got tokens there; others have had to apply and wait like everyone else. Part of the reason for the lack of activity may simply be that not many people have been able to get in.

There are other issues too. I’ve spent some time trying out Live Framework and building applications for Live Mesh. I’ve written this up separately, in a piece that will be posted shortly. However, I found it harder than I expected to get good information on how to proceed. There is plenty of high-level marketing, but hands-on documentation is lacking. Azure may be different – though I was interested to find another user with similar frustrations (it’s worth reading this thread, as Microsoft’s moderator Yi-Lun Luo gives a handy technical outline of Azure and Live Services).

Still, let’s bear in mind that PDC is where Microsoft shares early technical information about the Windows platform, which is subject to change. Anyone who built applications for the preview Windows Longhorn code doled out at PDC 2003 (Paul Thurrott’s report is a reminder of what it felt like at the time) would have been in for some disappointment – Longhorn was both greatly delayed and much altered for its eventual release as Windows Vista.

It’s possible then that most developers are wisely waiting for the beta of Azure before doing serious experimentation. Alternatively – the bleakest outcome for Microsoft – they are ignoring Azure and presuming that if and when they do migrate applications to the cloud they will use some other platform.

Either way, I’d suggest that Microsoft’s evangelism of Azure has been poor since PDC. There is more buzz about other things presented there – including Windows 7, which in contrast to Azure seems nearly done.

Update

Matt Rogers from Microsoft comments below that the service is not going to change radically between now and general release. He claims that feedback is extensive, but not evident in the online forums because it comes from other sources – he told me on Twitter that “we are getting much of it directly through relationships with customers, local user group meetings and through our evangelists”.

Maarten Balliauw has converted an application to Azure and written up the experience on his blog. He is using Azure TableStorage for data and Live ID for authentication. He says:

Overall, Microsoft is doing a good job with Azure. The platform itself seems reliable and stable, the concept is good.

Unfortunately the app itself does not work at the time of writing.

BBC iPlayer AIR app brings downloads to Mac and Linux

I’ve successfully installed the new BBC iPlayer AIR application on Windows, Mac and Linux – and I’m mostly impressed so far. The main snag is that you have to click the Labs tester button on a separate page before the download works – but this isn’t mentioned on the download page. Another usability issue is that when you start up the app it invites you to start downloading; you click the link, and the iPlayer web site opens in your default browser with no advice on what to do next. You have to find a programme which includes a “download to computer” link – most of them do not. I found a Roy Orbison documentary that worked.

This was a better experience than the early days of the old iPlayer download client, though on Linux (Ubuntu Intrepid Ibex) I found that I needed to fiddle with the settings and specifically allocate some disk space before it would accept downloads.

An interesting aspect of the new iPlayer is that it replaces a peer-to-peer download system with a direct download. I discussed the implications of this at some length with both Anthony Rose at the BBC, and with a couple of ISPs, when I was researching an interview for the Guardian. In the end there wasn’t enough space to include much of this technical detail, though I’m hoping to post some of it in the near future.

A quick summary: the ISPs are not in favour of peer-to-peer, because it is less efficient – typically, all the retries cause approximately double the amount of data to be transferred (according to my source). On the other hand, they don’t like the BBC’s move towards Level 3 rather than Akamai, because it works out more expensive for them. ISPs could install their own boxes to stream the BBC’s content, saving them operational money, but these are apparently expensive to buy and install; I was told that the iPlayer’s traffic does not yet justify it, but that if it grows to, say, twice its current level, it will become economic.

The biggest cost though is the last step, from the ISP to the user. This is where the cable companies (mostly Virgin Media) have a big advantage, since the cable goes to your doorstep and is designed to accommodate digital broadcasts. ISPs that have taken advantage of local loop unbundling are also relatively well placed. Those that pay BT wholesale for the traffic are the most vulnerable.

The other important point is that there is always something you can do to manage increased traffic – though not necessarily quickly. If everyone in the UK suddenly tries to watch HD video at the same time, the system will seize up, but that won’t happen. What will happen is that increasing numbers of people will find that their cheap transfer-limited packages are no longer sufficient and they will need to upgrade.


Vista’s mysterious compatibility settings: what do they do?

I hate the Program Compatibility Assistant in Vista. Why?

First, because it applies settings whether you like it or not. There’s no option to say, “I was happy with how it ran, just leave it alone”.

Second, because it does not tell you what it has done. Sure, there is a link that says “What settings are applied?”, so you click it.

And you get a generic help dialog with six headings. You click the most promising – “What changes does it make?” – and it says:

It depends on the problem, but any changes made are related to how Windows runs the program. No changes are made to the program itself. For example, the Program Compatibility Assistant can resolve conflicts with User Account Control, a new security feature in this version of Windows that can help make your computer safer. Or, it can run the program in a mode that simulates earlier versions of Windows. The changes that Program Compatibility Assistant makes are done automatically, so you don’t need to make them.

Vague and uninformative. And that’s it.

So how do you find out what it really did? Well, you could read this article. Then you can fire up RegEdit and look at:

Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers

under both HKEY_CURRENT_USER and HKEY_LOCAL_MACHINE. Here I found an entry for FlexBuilder.exe whose value data read ELEVATECREATEPROCESS.
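
If you would rather not dig through RegEdit by hand, the same check can be scripted. Here is a minimal C sketch of my own (illustrative only, not from the article) which uses the Win32 registry API to list the compatibility layers in both hives; link it against advapi32:

#include <windows.h>
#include <stdio.h>

static void dump_layers(HKEY root, const char *rootName)
{
    HKEY key;
    char name[MAX_PATH];
    BYTE data[512];
    DWORD i = 0, nameLen, dataLen, type;

    if (RegOpenKeyExA(root,
            "Software\\Microsoft\\Windows NT\\CurrentVersion\\AppCompatFlags\\Layers",
            0, KEY_READ, &key) != ERROR_SUCCESS)
        return; /* key absent: no compatibility layers set in this hive */

    for (;;) {
        nameLen = sizeof(name);
        dataLen = sizeof(data) - 1;
        if (RegEnumValueA(key, i++, name, &nameLen, NULL, &type, data, &dataLen)
                != ERROR_SUCCESS)
            break;
        data[dataLen] = 0; /* value name is the exe path, data the layer flags */
        if (type == REG_SZ)
            printf("%s: %s -> %s\n", rootName, name, (char *)data);
    }
    RegCloseKey(key);
}

int main(void)
{
    dump_layers(HKEY_CURRENT_USER, "HKCU");
    dump_layers(HKEY_LOCAL_MACHINE, "HKLM");
    return 0;
}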

OK, so Flex Builder will now have ELEVATECREATEPROCESS applied. What does that mean? Here’s the scoop:

Here the test program was trying to launch an updater which is required to run as administrator and failed. In this case, PCA will apply the ELEVATECREATEPROCESS compatibility mode, which will enable the program to successfully launch the child exe as administrator the next time. Now when the program is run the next time and while trying to launch the updater, it will not fail and will successfully run as administrator. The user will see the UAC consent UI.

More details on what happens under the covers are explained through the Q/A below.

  1. What is the detection logic, and how does PCA know that the program failed to launch a child exe which needs to run as administrator? The detection for this scenario is accomplished through instrumentation at the CreateProcess API to detect the cases when a child process launch fails due to the requirement to run as administrator.
  2. Why are there no options in this PCA dialog? Due to the high confidence on the issue detection in this scenario, the solution (ELEVATECREATEPROCESS compatibility mode) is automatically applied and the user is not given any options.

In my case I believe there was some problem with the Flex Builder debugger trying to launch Firefox, which was also trying to update itself.

I believe Adobe could avoid this problem by marking Flex Builder as UAC-aware. Then Windows would leave it alone.
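
For what it’s worth, marking an application as UAC-aware means giving it a manifest (embedded in the exe, or as an exe.manifest file alongside it) that declares a requested execution level – something like this:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
    <security>
      <requestedPrivileges>
        <!-- asInvoker declares that the app runs fine without elevation,
             which tells Vista not to apply its compatibility heuristics -->
        <requestedExecutionLevel level="asInvoker" uiAccess="false"/>
      </requestedPrivileges>
    </security>
  </trustInfo>
</assembly>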

First steps with Adobe Alchemy: good news and bad

I’m fascinated by Adobe’s Alchemy project, which compiles C and C++ code into ActionScript, and stayed up late last night to give it a try. I used Ubuntu Linux, which felt brave since the instructions are for Windows (using Cygwin, which enables somewhat Unix-like development) or Mac.

Techie note: It took me a while to get going; like a number of others I ran into a configuration problem, the symptom being that you type alc-on, which is meant to enable the special Alchemy version of GCC (the standard open source C compiler), but instead get “Command not found”. In the end I ran the command:

source $ALCHEMY_HOME/alchemy-setup

directly from the terminal instead of in my bash profile, and that fixed it. The command alc-on is actually an alias created by this setup script.

After that it was relatively plain sailing. I used a recent version of the Flex 3 SDK, and adapted the stringecho.c example to create a countprimes function, because I was curious to see how its performance would compare with other implementations.
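
In case it is useful, the adaptation looked roughly like this – a sketch following the stringecho.c pattern, where the AS3_* marshalling calls come from Alchemy’s AS3.h (the exact glue API may differ between Alchemy builds):

#include "AS3.h"

/* count the primes up to n by trial division */
static AS3_Val countprimes(void *self, AS3_Val args)
{
    int n = 0, count = 0, i, j, isprime;
    AS3_ArrayValue(args, "IntType", &n); /* unpack the single int argument */
    for (i = 2; i <= n; i++) {
        isprime = 1;
        for (j = 2; j * j <= i; j++) {
            if (i % j == 0) { isprime = 0; break; }
        }
        count += isprime;
    }
    return AS3_Int(count);
}

int main()
{
    /* expose countprimes to ActionScript; AS3_LibInit does not return */
    AS3_Val method = AS3_Function(NULL, countprimes);
    AS3_Val lib = AS3_Object("countprimes: AS3ValType", method);
    AS3_Release(method);
    AS3_LibInit(lib);
    return 0;
}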

The result of my efforts was a Flex library called primetest.swc. I copied this to my Windows box, where I have Flex Builder installed. I configured Flex Builder to use Flash Player 10 and the Flex 3.2 SDK. Then I modified the prime-counting applet so I could compare the Alchemy version of the function with ActionScript. Once you have the .swc, the Alchemy code is easy to use:

import cmodule.primetest.CLibInit;

//code omitted

if (chkAlchemy.selected)
{
    // load the Alchemy-compiled library; init() returns an object
    // whose methods are the C functions exposed by the .swc
    var loader:CLibInit = new CLibInit();
    var lib:Object = loader.init();
    numprimes = lib.countprimes(n);
}
else
{
    // plain ActionScript implementation, for comparison
    numprimes = countprimes_as(n);
}

Then I tried the application, not using the debug player. The Alchemy code was slightly slower than ActionScript; with a higher value (10,000,000) I got 34.95 secs for Alchemy versus 32.59 secs for ActionScript.

Conclusions? First, despite the slower performance, Alchemy is mighty impressive. Let’s not forget the intent of Alchemy to enable reuse of C and C++ libraries; it is not just about speed.

Second, note that Alchemy is still a research project and may get faster. Further, I may have missed some tricks in my code.

Third, note that this sort of tight loop is ideal for just-in-time compilation and really should run at speeds close to that of native code anyway. In Silverlight or Java it does.

So does the test prove anything? Well, it shows that Alchemy won’t always speed your code, which raises the question: in what circumstances will it be a performance win? There is a useful post here from Joe Steele:

The potential wins on performance are if you are operating on a large ByteArray (because of the optimized ByteArray opcodes) or if your code is doing a lot of boxing/unboxing (which largely goes away with Alchemy’s arg passing model). You are ultimately getting ActionScript out, so if you already have hand-tuned ActionScript that avoids boxing/unboxing and is dealing with small chunks of data, you are not likely to beat that with Alchemy.

My experience is in line with his comments.

The general claim made in this thread (see Brandan Hall’s post) is that:

… under favorable conditions [Alchemy] runs 10 times slower than native code but 10 times faster than ActionScript

I guess the big question, from a performance perspective, is how much real-world code that is worth optimizing (because it does intensive processing that the user notices) falls into the “favorable conditions” category. If that turns out to be not many, we may have to abandon the idea of Alchemy as a performance fix, and just enjoy the portability it enables.

Former eBay scientist complains of “Dilbertian compromises”

Former eBay scientist Raghav Gupta – who composed a farewell poem – has given an interview in which he talks about the difficulty of getting innovations deployed at eBay:

There is actually a lot of good innovation happening nowadays in terms of demos and prototypes and contests, but hardly anything worthwhile ever makes it out. The personal cost of having to push something down the approval and implementation pipeline is so great that very few are able to persevere. And whatever does get out usually suffers through so many Dilbertian compromises that it is missing the core aspect of the original idea.

You should never judge a company on the basis of comments from departed employees (though Gupta left of his own volition). That said, there is plenty of evidence that eBay is pursuing a policy that is punishing sellers (that is, its customers) in favour of buyers, as well as increasing its fees. The bizarre ratings policy, in which sellers rated as good by buyers get suspended, is one example. Good sellers get suspended for other reasons too, apparently, and I have learned a new bit of jargon as a result – “dolphin”:

Dolphins are those sellers that get suspended by eBay because their system automatically singles them out for skirting the rules or something more grievous, yet if a human actually looked at the seller’s account they would find it was a simple mistake and now they are screwed.

according to Randy Smythe, who is chronicling Dolphin stories on his blog.

Another thing I’ve noticed is that eBay’s default search results (which you cannot change globally) are now ordered by what it calls “Best Match”, according to an unknown algorithm, rather than the old “Time: ending soonest”. This change gives eBay the ability to promote sellers it likes at the expense of those it does not – though I do not know whether it uses Best Match in this way. If your items are always several pages deep in the “Best Match” ordering, it is unlikely you will make many sales. Sellers now find themselves discussing “Best Match” optimization techniques, akin to Search Engine Optimization on the wider web.

Plenty of frustration for small sellers, then, and the trend at eBay seems to be towards fixed-price sales from large vendors, though of course you can still grab an auction bargain on occasion.

Gupta suggests that sellers get together and invent an alternative eBay on a mutual ownership basis. A nice idea, though eBay is dominant and it won’t be easy to dent its market share. Most disgruntled sellers seem to head for Amazon marketplace.


Top ten sites where developers hang out

Someone asked me today where developers hang out on the Net. Excluding platform-specific sites like MSDN, here’s my first go at a top ten list, in alphabetical order – but I’d love to hear other suggestions.

 

Coding Horror – Jeff Atwood’s site. Both Atwood and Spolsky are on the Stack Overflow team.

Dzone – DIGG for developers. Also includes Javalobby, which was set up on a wave of indignation back in the nineties, when Microsoft seemed to be endangering Java’s run-anywhere promise, and is now a general Java discussion site.

Joel on Software – I don’t always agree with Joel Spolsky, but he’s an excellent writer and deserves inclusion if only for his piece on leaky abstractions.

Reddit programming – another DIGG for developers; I actually prefer this one to Dzone.

Slashdot – Geek news discussion site, not just for developers though you’ll find plenty of them there. If you need to know the meaning of the site name, you are outside the target readership.

Stack Overflow – community question and answer site for developers, mentioned by everyone when I asked about developers’ favourite sites on Twitter.

The Code Project – masses of code samples, mainly .NET but also Java, SQL, Linux

The Daily WTF – where WTF stands for “Worse than failure”, allegedly

The Register – IT news site willing to “Bite the hand that feeds IT”, includes a developer section. Disclaimer: I am a contributor.

TheServerSide – mainly Java, but with a .NET site too, both geared to enterprise development. Seems less busy of late.

 

Other developer haunts I considered:

DaniWeb – IT discussion community with an active software development forum

DeveloperFusion – based in the UK, developer-focused articles, blogs and resources

Dr Dobb’s – web site for the best software development magazine, docked several points for ad videos that auto-play when you visit.

Rick Strahl’s web log – where you go to discover what really works or does not work in ASP.NET

SDTimes – web site for a family of software development magazines; some good articles but not really a community.

 

Update: Other sites I probably should have included:

InfoQ – developer-focused articles with an Agile/enterprise slant; sponsors the excellent QCon conferences.

Why Audio Matters: Greg Calbi and others on “this age of bad sound”

The Philoctetes Center in New York, which is dedicated to “the multidisciplinary study of imagination”, hosted a round table entitled “Deep Listening: Why Audio Quality Matters”, which you can view online or download – two and a half hours long. It is moderated by Greg Calbi, mastering engineer at Sterling Sound. Other participants are Steve Berkowitz from Sony Music, audio critic and vinyl enthusiast Michael Fremer, academic and audio enthusiast Evan Cornog representing the listening public, audio engineer Kevin Killen, and record producer Craig Street. A good panel, responsible for sounds from a glittering array of artists over many years.

Here’s how Calbi introduces it:

The senses hold mysteries for us ranging from the sacred to the profane … this seminar … is about the beauty of sound and the alienating and off-putting effects of this age of bad sound, and about a commercial infrastructure which seems to be ignoring the potential which great sound holds to trigger our emotions and energy

More and more people, especially young people, are missing the opportunity to hear really high-definition audio: distorted live concerts, which I’m sure we’ve all been to; MP3s on iPods with earbuds; over-simplified sound for mini-systems – musical junk food.

Listening the modern way doesn’t mean that musical experiences won’t be meaningful and wondrous, but the door to their imaginations can be opened wider. We must promote within our business a product which is state of the art, truly satisfying, geared to maximum sonic quality, not just portability, convenience and ease of purchase, which is where our entire business has been going for the last 5 to 10 years.

If you care about such things, it is a fascinating debate – heated at times, as participants disagree about the merits of MP3, CD, SACD and vinyl – though all agree that the overabundance of mid-fi portable music has left the high end all but abandoned.

Unfortunately absent from the round table is any representative of the teenagers and twenty-somethings who historically have been the primary audience for popular music. Too busy out there enjoying it I guess.

There is something absurd about sitting in on audio comparisons via a heavily compressed, lossy, downloaded or streamed video, but never mind.

I’m reluctant to take the side of the high-end audio industry, because while there is no shortage of good products out there, the industry is also replete with bad science, absurdly expensive cables, and poor technical advice in many retailers (in my experience). On the other hand, I’m inclined to agree that sound quality has taken as many steps back as forward in the past thirty years. It is not only the loudness wars (which get good coverage here around two hours in); it is a general problem that producers and engineers do too much processing because they can. Here’s Calbi:

Kevin would be able to detail what goes into the average album now, and how many stages, how many digital conversions, and how many mults, and how many combinations of things – by the time you actually get it, you wouldn’t believe it. We wouldn’t have time to talk about it; we’d have to have a weekend.

The outcome is music that may sound good, but rarely sounds great, because it is too far away from what was performed. Auto-tune anyone? Killen describes how he worked with Elvis Costello on a song where Costello’s voice cracks slightly on one of the high notes. They wanted to re-do that note; they fixed it, but returned to the original because it conveyed more emotion. Few are so wise.

Similar factors account for why old CDs and even vinyl sometimes sound better than remastered versions, when matched for volume.

Another part of the discussion covers the merits of different musical formats. There is almost a consensus that standard “Red Book” 16/44 CD is inadequate, and that high-res digital like SACD – or even, according to some, vinyl records – is needed to get the best sound. I’m still not convinced: a study reported in the September 2007 issue of the Journal of the Audio Engineering Society performed hundreds of blind tests and concluded that passing an audio signal through 16-bit/44.1kHz A/D/A conversion made no audible difference. The theory backs this up – 44.1kHz sampling captures frequencies up to 22.05kHz, just beyond the limit of adult hearing, and 16 bits allows around 96dB of dynamic range – though someone could still say it was a bad test for this or that reason.

Craig Street suggests in this Philoctetes round table that high-res digital sounds better because we actually take in high frequencies that CD cannot reproduce through our bodies as well as our ears. Count me deeply sceptical – it might be a reason why live performances sound better than recordings, but those super-high frequencies would surely not be reproduced by normal loudspeakers anyway, even if they made it that far through the chain.

Still, while the reasons may be as much to do with mastering choices as inherent technical superiority, individual SACD and DVD Audio discs often do sound better than CD equivalents, and of course offer surround sound as well as stereo.

However, the market for these is uncertain at best. Cornog says:

I feel like a sucker. I’ve got the SACD player, and it sounds great, and I can’t buy it any more. It’s dying.

Before we get too carried away with the search for natural sound, a thought-provoking comment from Berkowitz:

The Beatles didn’t ever play Sergeant Pepper. It only existed in the studio.

He calls it “compositional recording”; and of course it is even more prevalent today.

I was also interested by what Berkowitz says about remastering classic albums from pre-digital years. He says that the engineers should strive to reproduce the sound of the vinyl, because that was the original document – not the tape, which the artist and producer worked on to create the LP. It is a fair point, especially with Sixties singles, which I understand were made to sound quite different during the cutting process. It was the single that became a hit, not the tape, which is one reason why those CD compilations never sound quite right, for those who can remember the originals.


Microsoft’s three ways to store stuff online

Windows Live Sync has just been released. This is an update to FolderShare, a file synchronization service which Microsoft acquired from Byte Taxi in 2005. The new version supports up to 20 folders and 20,000 files, supports Unicode, and still works on both Windows and Mac. It has also been migrated to Windows Live ID, breaking existing shared access lists and apparently causing much pain for existing users, especially on Macs, judging by the comments here.

But why does Live Sync exist? Most of what it does is a subset of what is available in Live Mesh, which as I understand it is a strategic product. I prefer Mesh, since synchronized files are also stored on the Internet for browser access. Another flaw in Live Sync is that if you enable remote access to folders on your PC that are not sync folders, the files are transferred without encryption.

Finally, let’s not forget SkyDrive, free online storage of up to 25GB. I use this regularly; it works nicely when you just want to stick some files (especially large ones) on the web without worrying about the complexities of synchronization.

Why doesn’t Microsoft integrate SkyDrive with the Mesh online desktop as an online-only storage area, and scrap Live Sync completely – sorry, “roll its functionality into Live Mesh”, in PR-speak?

BBC looking at OpenID for iPlayer social network

At Adobe’s MAX conference in Milan last week I spoke to the BBC’s Anthony Rose, who runs iPlayer at the BBC, and wrote this up for today’s Guardian. One of the things we discussed is social networking planned for iPlayer, where you will be able to see comments, ratings and recommendations from your friends. I asked Rose how user identities will be managed:

“We’ll make sure you never have to log in to use our services. But if you want to post comments and create a profile then you’ll need to log in. We’re going to start by using a BBC one, then we’re going to look at OpenID and see if we can synch to others. OpenID is very cool but is a challenging user experience, and some people will get it, and some will go, why have you made it more difficult?”

Right now there are multiple competing “networks of friends”: Facebook, MySpace, Microsoft Live Messenger, Twitter and so on. Facebook is trying to extend its reach with Facebook Connect; Google is evangelising OpenSocial, which “defines a common API for social applications across multiple websites”, along with an implementation called Friend Connect. It will be interesting to see to what extent the BBC creates yet another social network, and to what extent it hooks into existing ones.