All posts by onlyconnect

BBC iPlayer AIR app brings downloads to Mac and Linux

I’ve successfully installed the new BBC iPlayer AIR application on Windows, Mac and Linux – and I’m mostly impressed so far. The main snag is that you have to click the Labs tester button on a separate page before the download works – but this isn’t mentioned on the download page. Another usability issue is that when you start up the app it invites you to start downloading; you click the link, and the iPlayer web site opens in your default browser with no advice on what to do next. You have to find a programme that includes a “download to computer” link – most of them do not. I found a Roy Orbison documentary that worked (no, that’s not Roy Orbison in the pic, but another singer).

This was a better experience than early days with the old download iPlayer, though on Linux (Ubuntu Intrepid Ibex) I found that I needed to fiddle with the settings and allocate some disk space specifically before it would accept downloads.

An interesting aspect of the new iPlayer is that it replaces a peer-to-peer download system with a direct download. I discussed the implications of this at some length with both Anthony Rose at the BBC, and with a couple of ISPs, when I was researching an interview for the Guardian. In the end there wasn’t enough space to include much of this technical detail, though I’m hoping to post some of it in the near future.

A quick summary: the ISPs are not in favour of peer-to-peer, because it is less efficient – typically, all the retries cause roughly double the amount of data to be transferred (according to my source). That said, they don’t like the BBC’s move towards Level 3 rather than Akamai either, because it works out more expensive for them. ISPs could install their own boxes to stream the BBC’s content, saving them operational money, but these are apparently expensive to buy and install; I was told that the iPlayer’s traffic does not yet justify it, but if it grows to, say, twice what it is now, it will become economic.

The biggest cost though is the last step, from the ISP to the user. This is where the cable companies (mostly Virgin Media) have a big advantage, since the cable goes to your doorstep, and is designed to accommodate digital broadcasts. ISPs that have taken advantage of local loop unbundling are also relatively well placed. Those that pay BT Wholesale for the traffic are the most vulnerable.

The other important point is that there is always something you can do to manage increased traffic – though not necessarily quickly. If everyone in the UK suddenly tries to watch HD video at the same time, the system will seize up, but that won’t happen. What will happen is that increasing numbers of people will find that their cheap transfer-limited packages are no longer sufficient and they will need to upgrade.


Vista’s mysterious compatibility settings: what do they do?

I hate this Program Compatibility Assistant in Vista. Why?

First, because it applies settings whether you like it or not. There’s no option to say, “I was happy with how it ran, just leave it alone”.

Second, because it does not tell you what it has done. Sure, there is a link that says, “What settings are applied?” So you click it.

And you get a generic help dialog with six headings. You click the most promising: “What changes does it make?” It says:

It depends on the problem, but any changes made are related to how Windows runs the program. No changes are made to the program itself. For example, the Program Compatibility Assistant can resolve conflicts with User Account Control, a new security feature in this version of Windows that can help make your computer safer. Or, it can run the program in a mode that simulates earlier versions of Windows. The changes that Program Compatibility Assistant makes are done automatically, so you don’t need to make them.

Vague and uninformative. And that’s it.

So how do you really find out? Well, you could read this article. Then you can fire up RegEdit and look at:

Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers

under both HKEY_CURRENT_USER and HKEY_LOCAL_MACHINE. Here I found an entry for FlexBuilder.exe.
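
The entry is a string value named with the full path to the executable, whose data lists the compatibility modes applied – something like this (a reconstructed example assuming a default Flex Builder install path; the exact path will differ on your machine):

[HKEY_CURRENT_USER\Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers]
"C:\Program Files\Adobe\Flex Builder 3\FlexBuilder.exe"="ELEVATECREATEPROCESS"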

OK, so Flex Builder will now have ELEVATECREATEPROCESS applied. What does that mean? Here’s the scoop:

Here the test program was trying to launch an updater which is required to run as administrator and failed. In this case, PCA will apply the ELEVATECREATEPROCESS compatibility mode, which will enable the program to successfully launch the child exe as administrator the next time. Now when the program is run the next time and while trying to launch the updater, it will not fail and will successfully run as administrator. The user will see the UAC consent UI.

More details on what happens under the covers is explained through Q/A below.

  1. What is the detection logic and how does PCA know that the program failed to launch a child exe which needs to run as administrator? The detection for this scenario is accomplished through instrumentation at the CreateProcess API to detect the cases when a child process launch fails due to the requirement to run as administrator.
  2. Why are there no options in this PCA dialog? Due to the high confidence on the issue detection in this scenario, the solution (ELEVATECREATEPROCESS compatibility mode) is automatically applied and the user is not given any options.

In my case I believe there was some problem with the Flex Builder debugger trying to launch Firefox, which was also trying to update itself.

I believe Adobe could avoid this problem by marking Flex Builder as UAC-aware. Then Windows will leave it alone.
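
The standard way to do that is to embed an application manifest declaring a requestedExecutionLevel, which tells Windows the program was written with UAC in mind. A minimal sketch of the sort of manifest I mean (mine, not Adobe’s actual manifest):

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
    <security>
      <requestedPrivileges>
        <!-- asInvoker: run with the user's normal token, no elevation required -->
        <requestedExecutionLevel level="asInvoker" uiAccess="false"/>
      </requestedPrivileges>
    </security>
  </trustInfo>
</assembly>

With a manifest like this in place, the Program Compatibility Assistant treats the program as Vista-aware and stops second-guessing it.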

First steps with Adobe Alchemy: good news and bad

I’m fascinated by Adobe’s Alchemy project, which compiles C and C++ code into ActionScript, and stayed up late last night to give it a try. I used Ubuntu Linux, which felt brave since the instructions are for Windows (using Cygwin, which enables somewhat Unix-like development) or Mac.

Techie note: It took me a while to get going; like a number of others I ran into a configuration problem, the symptom being that you type alc-on, which is meant to enable the special Alchemy version of GCC (the standard open source C compiler), but instead get “Command not found”. In the end I ran the command:

source $ALCHEMY_HOME/alchemy-setup

directly from the terminal instead of in my bash profile, and that fixed it. The command alc-on is actually an alias created by this setup script.

After that it was relatively plain sailing. I used a recent version of the Flex 3 SDK, and adapted the stringecho.c example to create a countprimes function, because I was curious to see how it would compare to other results.
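
For the record, once the Alchemy environment is active, building the library is a single command – something like this, from memory, where countprimes.c is whatever you have called your adapted source file:

gcc countprimes.c -O3 -Wall -swc -o primetest.swc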

The result of my efforts was a Flex library called primetest.swc. I copied this to my Windows box, where I have Flex Builder installed. I configured Flex Builder to use Flash Player 10 and the Flex 3.2 SDK. Then I modified the prime-counting applet so I could compare the Alchemy version of the function with ActionScript. Once you have the .swc, the Alchemy code is easy to use:

import cmodule.primetest.CLibInit;

//code omitted

if (chkAlchemy.selected)
{
    var loader:CLibInit = new CLibInit;
    var lib:Object = loader.init();
    numprimes = lib.countprimes(n);
}
else
{
    numprimes = countprimes_as(n);
}
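
For reference, countprimes_as is an unremarkable ActionScript function – something along these lines (a sketch rather than my exact code):

private function countprimes_as(n:int):int
{
    var count:int = 0;
    // trial division: count every prime from 2 up to n
    for (var i:int = 2; i <= n; i++)
    {
        var prime:Boolean = true;
        for (var j:int = 2; j * j <= i; j++)
        {
            if (i % j == 0)
            {
                prime = false;
                break;
            }
        }
        if (prime)
        {
            count++;
        }
    }
    return count;
}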

Then I tried the application. Note this is not using the debug player.

The result: the Alchemy code is slightly slower than the ActionScript version. I also tried this with a higher value (10,000,000) and got 34.95 secs for Alchemy versus 32.59 secs for ActionScript.

Conclusions? First, despite the slower performance, Alchemy is mighty impressive. Let’s not forget that the intent of Alchemy is to enable reuse of C and C++ libraries; it is not just about speed.

Second, note that Alchemy is still a research project and may get faster. Further, I may have missed some tricks in my code.

Third, note that this sort of tight loop is ideal for just-in-time compilation and really should run at speeds close to that of native code anyway. In Silverlight or Java it does.

So does the test prove anything? Well, it shows that Alchemy won’t always speed your code, which raises the question: in what circumstances will it be a performance win? There is a useful post here from Joe Steele:

The potential wins on performance are if you are operating on a large ByteArray (because of the optimized ByteArray opcodes) or if your code is doing a lot of boxing/unboxing (which largely goes away with Alchemys arg passing model). You are ultimately getting ActionScript out so if you already have hand-tuned ActionScript that avoids boxing/unboxing and is dealing with small chunks of data, you are not likely to beat that with Alchemy.

My experience is in line with his comments.

The general claim made in this thread (see Branden Hall’s post) is that:

… under favorable conditions [Alchemy] runs 10 times slower than native code but 10 times faster than ActionScript

I guess the big question, from a performance perspective, is how much real-world code that is worth optimizing (because it does intensive processing that the user notices) falls into the “favorable conditions” category. If that turns out to be not many, we may have to abandon the idea of Alchemy as a performance fix, and just enjoy the portability it enables.

Former eBay scientist complains of “Dilbertian compromises”

Former eBay scientist Raghav Gupta – who composed a farewell poem – has given an interview in which he talks about the difficulty of getting innovations deployed at eBay:

There is actually a lot of good innovation happening nowadays in terms of demos and prototypes and contests, but hardly anything worthwhile ever makes it out. The personal cost of having to push something down the approval and implementation pipeline is so great that very few are able to persevere. And whatever does get out usually suffers through so many Dilbertian compromises that it is missing the core aspect of the original idea.

You should never judge a company on the basis of comments from departed employees (though Gupta left of his own volition). That said, there is plenty of evidence that eBay is pursuing a policy that is punishing sellers (that is, its customers) in favour of buyers, as well as increasing its fees. The bizarre ratings policy, in which sellers rated as good by buyers get suspended, is one example. Good sellers get suspended for other reasons too, apparently, and I have learned a new bit of jargon as a result – “dolphin”:

Dolphins are those sellers that get suspended by eBay because their system automatically singles them out for skirting the rules or something more grievous, yet if a human actually looked at the sellers account they would find it was a simple mistake and now they are screwed.

according to Randy Smythe, who is chronicling Dolphin stories on his blog.

Another thing I’ve noticed is that eBay’s default search results (which you cannot change globally) are now ordered in what it calls “Best Match”, according to an unknown algorithm, rather than the old system of “Time: Ending soonest”. This change gives eBay the ability to promote sellers it likes at the expense of sellers it does not like – though I do not know whether it uses Best Match in this way. If your items are always several pages away according to “Best match”, it is unlikely you will make many sales. Sellers are now having to discuss “Best Match” optimization techniques similar to Search Engine Optimization on the wider web.

Plenty of frustration for small sellers, then, and the trend at eBay seems to be towards fixed-price sales from large vendors, though of course you can still grab an auction bargain on occasion.

Gupta suggests that sellers get together and invent an alternative eBay on a mutual ownership basis. A nice idea, though eBay is dominant and it won’t be easy to dent its market share. Most disgruntled sellers seem to head for Amazon marketplace.


Top ten sites where developers hang out

Someone asked me today where developers hang out on the Net. Excluding platform-specific sites like MSDN, here’s my first go at a top ten list, in alphabetical order – but I’d love to hear other suggestions.

 

Coding Horror – Jeff Atwood’s site. Both Atwood and Spolsky are on the Stack Overflow team

Dzone – Digg for developers; also includes Javalobby, which was set up on a wave of indignation back in the nineties when Microsoft seemed to be endangering Java’s run-anywhere promise, and is now a general Java discussion site.

Joel on Software – I don’t always agree with Joel Spolsky, but he’s an excellent writer and deserves inclusion if only for his piece on leaky abstractions.

Reddit programming – another Digg for developers; actually I prefer this one to Dzone

Slashdot – geek news discussion site, not just for developers, though you’ll find plenty of them there. If you need to know the meaning of the site name, you are outside the target readership.

Stack Overflow – community question and answer site for developers, mentioned by everyone when I asked about developers’ favourite sites on Twitter.

The Code Project – masses of code samples, mainly .NET but also Java, SQL, Linux

The Daily WTF – where WTF stands for “Worse Than Failure”, allegedly

The Register – IT news site willing to “Bite the hand that feeds IT”, includes a developer section. Disclaimer: I am a contributor.

TheServerSide – mainly Java but also a .NET site, both geared to enterprise development. Seems less busy of late.

 

Other developer haunts I considered:

DaniWeb – IT discussion community with an active software development forum

DeveloperFusion – based in the UK, developer-focused articles, blogs and resources

Dr. Dobb’s – web site for the best software development magazine, docked several points for ad videos that auto-play when you visit

Rick Strahl’s web log – where you go to discover what really works or does not work in ASP.NET

SDTimes – web site for a family of software development magazines, some good articles but not really a community

 

Update: Other sites I probably should have included:

InfoQ – developer-focused articles with an Agile/enterprise focus; sponsors the excellent QCon conferences

Why Audio Matters: Greg Calbi and others on “this age of bad sound”

The Philoctetes Center in New York, which is dedicated to “the multidisciplinary study of imagination”, hosted a round table entitled “Deep Listening: Why Audio Quality Matters”, which you can view online or download – two and a half hours long. It is moderated by Greg Calbi, mastering engineer at Sterling Sound. Other participants are Steve Berkowitz from Sony Music, audio critic and vinyl enthusiast Michael Fremer, academic and audio enthusiast Evan Cornog representing the listening public, audio engineer Kevin Killen, and record producer Craig Street. A good panel, responsible for sounds from a glittering array of artists over many years.

Here’s how Calbi introduces it:

The senses hold mysteries for us ranging from the sacred to the profane … this seminar … is about the beauty of sound and the alienating and off-putting effects of this age of bad sound, and about a commercial infrastructure which seems to be ignoring the potential which great sound holds to trigger our emotions and energy

More and more people especially young people are missing the opportunity to hear really high definition audio. Distorted live concerts, which I’m sure we’ve all been to, MP3s on iPods with earbuds, over-simplified sound for mini-systems, musical junk food.

Listening the modern way doesn’t mean that musical experiences won’t be meaningful and wondrous, but the door to their imaginations can be opened wider. We must promote within our business a product which is state of the art, truly satisfying, geared to maximum sonic quality, not just portability, convenience and ease of purchase, which is where our entire business has been going for the last 5 to 10 years.

If you care about such things, it is a fascinating debate – heated at times, as participants disagree about the merits of MP3, CD, SACD and vinyl – though they agree that an overabundance of mid-fi portable music has left the high end all but abandoned.

Unfortunately absent from the round table is any representative of the teenagers and twenty-somethings who historically have been the primary audience for popular music. Too busy out there enjoying it I guess.

There is something absurd about sitting in on audio comparisons via a heavily compressed, lossy downloaded or streamed video, but never mind.

I’m reluctant to take the side of the high-end audio industry, because while there is no shortage of good products out there, the industry is also replete with bad science, absurdly expensive cables, and poor technical advice in many retailers (in my experience). On the other hand, I’m inclined to agree that sound quality has taken as many steps back as forward in the past thirty years. It is not only the loudness wars (which get good coverage here, around two hours in); it is a general problem that producers and engineers do too much processing, because they can. Here’s Calbi:

Kevin would be able to detail what goes into the average album now, and how many stages, how many digital conversions, and how many mults, and how many combinations of things – by the time you actually get it, you wouldn’t believe it, we wouldn’t have time to talk about, we’d have to have a weekend.

The outcome is music that may sound good, but rarely sounds great, because it is too far away from what was performed. Auto-tune anyone? Killen describes how he worked with Elvis Costello on a song where Costello’s voice cracks slightly on one of the high notes. They wanted to re-do that note; they fixed it, but returned to the original because it conveyed more emotion. Few are so wise.

Similar factors account for why old CDs and even vinyl sometimes sound better than remastered versions, when matched for volume.

Another part of the discussion concerns the merits of different musical formats. There is almost a consensus that standard “Red Book” 16/44 CD is inadequate, and that high-res digital like SACD or even, according to some, vinyl records are needed to get the best sound. I’m still not convinced; a study quoted in the September 2007 issue of the Journal of the Audio Engineering Society performed hundreds of blind tests and concluded that passing an audio signal through 16-bit/44.1-kHz A/D/A conversion made no audible difference. Someone could still say it was a bad test for this or that reason.

Craig Street in this Philoctetes round table suggests that high-res digital sounds better because we actually take in high frequencies that CD cannot reproduce through our bodies as well as our ears. Count me deeply sceptical – it might be a reason why live performances sound better than recordings, but surely those super-high frequencies would not be reproduced by normal loudspeakers anyway, even if they made it that far through the chain.

Still, while the reasons may be as much to do with mastering choices as inherent technical superiority, individual SACD and DVD Audio discs often do sound better than CD equivalents, and of course offer surround sound as well as stereo.

However, the market for these is uncertain at best. Cornog says:

I feel like a sucker. I’ve got the SACD player, and it sounds great, and I can’t buy it any more. It’s dying.

Before we get too carried away with the search for natural sound, a thought-provoking comment from Berkowitz:

The Beatles didn’t ever play Sergeant Pepper. It only existed in the studio.

He calls it “compositional recording”; and of course it is even more prevalent today.

I was also interested by what Berkowitz says about remastering classic albums from pre-digital years. He says that the engineers should strive to reproduce the sound of the vinyl, because that was the original document – not the tape, which the artist and producer worked on to create the LP. It is a fair point, especially with things like Sixties singles, which I understand were made to sound quite different during the cutting process. It was the single that became a hit, not the tape, which is one reason why those CD compilations never sound quite right, for those who can remember the originals.


Microsoft’s three ways to store stuff online

Windows Live Sync has just been released. This is an update to FolderShare, a file synchronization service which Microsoft acquired from Byte Taxi in 2005. The new version supports up to 20 folders and 20,000 files, supports Unicode, and still works on both Windows and Mac. It has also been migrated to Windows Live ID, breaking existing shared access lists and apparently causing much pain for existing users, especially on Macs, judging by the comments here.

But why does Live Sync exist? Most of what it does is a subset of what is available in Live Mesh, which as I understand it is a strategic product. I prefer Mesh, since synchronized files are also stored on the Internet for browser access. Another flaw in Live Sync is that if you enable remote access to folders on your PC that are not sync folders, the files are transferred without encryption.

Finally, let’s not forget SkyDrive, free online storage of up to 25GB. I use this regularly; it works nicely when you just want to stick some files (especially large ones) on the web without worrying about the complexities of synchronization.

Why doesn’t Microsoft integrate SkyDrive with the Mesh online desktop as an online-only storage area, and scrap Live Sync completely – sorry, “roll its functionality into Live Mesh”, in PR-speak?

BBC looking at OpenID for iPlayer social network

At Adobe’s MAX conference in Milan last week I spoke to the BBC’s Anthony Rose, who runs iPlayer at the BBC, and wrote this up for today’s Guardian. One of the things we discussed is social networking planned for iPlayer, where you will be able to see comments, ratings and recommendations from your friends. I asked Rose how user identities will be managed:

“We’ll make sure you never have to log in to use our services. But if you want to post comments and create a profile then you’ll need to log in. We’re going to start by using a BBC one, then we’re going to look at OpenID and see if we can synch to others. OpenID is very cool but is a challenging user experience, and some people will get it, and some will go, why have you made it more difficult?”

Right now there are multiple competing “networks of friends”: Facebook, MySpace, Microsoft Live Messenger, Twitter and so on. Facebook is trying to extend its reach with Facebook Connect; Google is evangelising OpenSocial which “defines a common API for social applications across multiple websites”, along with an implementation called Friend Connect. It will be interesting to see to what extent the BBC creates yet another social network, and to what extent it hooks into existing ones.

First steps with offline Silverlight and Live Framework

Yesterday I wrote a simple test application for Silverlight running on Live Mesh. It is an interesting scenario, which enables Silverlight applications to run offline, in the style of Adobe AIR. I wrote a to-do list which stores its data in the cloud; I added some items online, and deleted and added some items offline on another machine, and when it reconnected all the edits synchronized. Cool.

Even so, I’m not finding this particularly easy. Note that this is a limited-access Community Tech Preview, so make big allowance for that. Here are some of the problems I’ve run into:

Convoluted sign-up and web site navigation.

This is a preview controlled by developer tokens. It’s complicated by the fact that a variety of different types of token control access to different parts of Windows Azure. You get a token, think you are done, then discover you have the wrong kind of token and need to apply for another one.

Even when you have the token, navigation is tricky. I’ve wasted time clicking through from one overview page to another; there are even promising links that seem to go back to the page you are looking at. Tip: when you find a useful page like the Developer Portal, for provisioning Azure and Live Framework apps, bookmark it quick.

Inadequate documentation.

OK, I guess this is to be expected in a preview. But I’m finding an over-abundance of overviews and diagrams, too many videos, and a lack of plain-speaking, developer-focused documentation that answers obvious questions. Most of the entries in the .NET reference for the Live Framework client illustrate the problem.

They are auto-generated docs with no description of what the class member is for or how to use it. You can expand the plus symbols, but it is not rewarding.

Slow going

Microsoft’s online MSDN documentation and forums work, but I never look forward to visiting them because I know they will be slow to navigate and I’ll be sitting waiting for pages to refresh. I’m not sure that Microsoft understands the importance of this point. It works – so why complain? Well, because the cumulative effect over time is to make me want to go elsewhere. Google and Yahoo, by contrast, usually run much more responsive sites.

Microsoft could improve this quite easily. The key: fast is more important than pretty.

Bugs and outages

I certainly expect bugs and outages in a CTP. Still, they are frustrating. My very simple Mesh app did not work at all on a Mac, even running in the browser. In this scenario, you don’t need the Mesh client; it is just a Silverlight application running in a web page. I asked about this on the forum:

I confirm it from the core team. With quite aggressive timeline for PDC release with the current level of resources and CTP goals, we could only test XP SP3/Vista SP1 and IE7. We will eventually support Mac and Safari but this is not an immediate priority in the near future for CTP QFEs.

Fair enough, though to me cross-platform is at the heart of why I might want to use this technology.

This morning my Silverlight Mesh application won’t run at all. I get a 404. I guess it’s the bleeding edge.

Online/Offline

I’ve also asked a couple of simple questions on the forum about the online/offline scenario. I’m finding that offline applications don’t run unless you are signed into Windows Live. That is, you can sign in and then go offline, and it works, but if you don’t sign in (for example, because you start up your laptop in an aeroplane), then the app does not start. There must be some way to use cached credentials?

My other question is about synchronization. How do you enforce constraints on a Mesh datafeed, given that it could be edited online and offline simultaneously, bypassing checks in your code? This is not quite the same as a conflict, where the same entry is edited by two different people. My example is how to make sure that duplicate items are not added to a list.

Early days

I expect that this Azure stuff will get much smoother in future updates; and bear in mind that what I’m working with here – Mesh, Silverlight, and the .NET Live Framework client library – is just one small corner of the whole.

I can’t help wondering though if Microsoft is being over-ambitious. Another technology I’ve been looking at recently is Adobe AIR. The scope of this is small relative to Live Mesh. You can describe it in a few words: run a Flash application on the desktop. It has limitations and frustrations, but at least it is easy to understand, and furthermore, it has pretty much worked as advertised from the earliest public previews.

By contrast, Mesh for developers feels like a huge thing that is part of an even huger thing (Azure); it has lots of promise, but it is harder to describe its essence in a few words (that are not marketing fluff).

That said, I like Silverlight itself. This piece at least is easy to grasp and works well, in my experience so far.

Working with Live Framework or Azure? I’d love to know how it is going for you.

JavaFX warns against itself on Macs

If you navigate to JavaFX.com on a Mac, you get a warning – at least, I do, and so does at least one other person.

The warning says:

This applet was signed by “JavaFX 1.0 Runtime,” but Java cannot verify the authenticity of the signature’s certificate. Do you trust this certificate? Click Trust to run this applet and allow it unrestricted access to your computer.

I trusted it anyway. Why? Mainly because it is on Sun’s site, and I doubt Sun was hacked. Second, because I clicked Show Certificate and it said everything was fine. Third, because on balance I think it is more likely that either Sun, Apple or a.n.other messed up either the cert or some other aspect of digital security programming, than that this particular bit of code belongs to a bad guy.

Nevertheless, I mention it because it illustrates the continuing hopeless state of Internet security. How on earth am I meant to know whether I should trust a certificate that “Java” has rejected? Who is this Java guy anyway? Why should I give any applet “unrestricted access” to my computer?

I see this all the time. We are confronted with impossible decisions, where one set of training tells us to click No – the certificate is out of date, the application is unsigned, the requested permissions are unwarranted – and another set of training tells us to click Yes – this is a reputable site, I need this installed to get on with my work, I’ve seen dialogs like this before and not come to any harm.

It might be better not to have the choice. In the scenario above, if the applet just refused to run, then there is a better chance that the problem would be treated as a bug and fixed. As it is, there is little chance that we will always guess right.

Technorati tags: , , ,