Google Health, Phorm, where next for your private data?

Let’s look at the fundamentals. Is an advertising company an appropriate place for sensitive personal data like health records? That’s easy to answer, no matter how many privacy assurances Google gives. Google specialises in mining personal data, and reading its terms and conditions is almost enough to stop me using its services. So Google Health? No thanks. Google, if you want to do this, split the company.

How about this idea: some of the UK’s largest ISPs – Carphone Warehouse, BT and Virgin Media – intend to hand over their users’ Internet history to an advertising company called Phorm. The Reg has more details – read the comments to get fully spooked. Someone has set up a protest site here.

Phorm says it has strong privacy practices that safeguard user data, audited by Ernst and Young [PDF]. Safeguards include:

  • Deleting raw data after 14 days
  • Removing numbers longer than 3 digits
  • Not storing email addresses or IP numbers
  • Not storing form fields (thus no passwords)
  • Identifying users only by a random number
  • Analysing data only for predetermined keywords

Happy now? No. Some of these protections are weak. For example, the AOL search data debacle proved that replacing IP numbers with random identifiers is insufficient protection, because users can be identified solely by their activity. This applies even more strongly to an ISP’s data, which covers everything you do on the Internet, not just your search history. Second, it is an opt-out system – it should be opt-in – and the opt-out on offer is weak; it merely stops you seeing the targeted ads, rather than preventing your data being sent to Phorm. Third, the data to be mined includes all your non-encrypted Internet activity, such as reading Google Mail, and not just the URLs you visit. Phorm says it won’t read that content, but any additional handling of this data makes it more vulnerable to interception and abuse.
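
To see why a random identifier is weak protection, here is a toy sketch in Python, with invented data: even if the logs carry nothing but a random number, anyone who knows a handful of the sites you habitually visit can pick your number out of the pile.

    # Toy example (hypothetical data): re-identifying an "anonymous" user ID
    # by matching a known partial profile against each ID's visited domains.

    def jaccard(a, b):
        a, b = set(a), set(b)
        return len(a & b) / len(a | b)

    histories = {   # per-ID browsing logs, as a profiler might hold them
        "id_4821": ["bbc.co.uk", "rarestampforum.example", "localchess.example", "gmail.com"],
        "id_9137": ["bbc.co.uk", "facebook.com", "youtube.com"],
    }
    known_profile = ["rarestampforum.example", "localchess.example"]   # what someone already knows about you

    best_match = max(histories, key=lambda uid: jaccard(known_profile, histories[uid]))
    print(best_match)   # the random number is now tied to a person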

What’s the answer? Change your ISP, of course; but also SSL, which encrypts your Internet traffic. Passwords are a weak enough mechanism already, without making matters worse by sending them in plain text; more broadly, we need to learn that anything we read or send in plain text over the Internet may already have been intercepted. This 2005 article spells out what that means. My hunch is that it is little better now. If we encrypt all the traffic that matters to us, then we won’t care so much that the ISP is selling it on.

[This post replaces an earlier draft].

Update: More details at the Reg today, complete with diagrams. Performance impact is also a concern.


Microsoft’s Vista Capable campaign: where it all went wrong

A series of remarkable internal emails has been made public as a result of the class action lawsuit against Microsoft over its “Vista Capable” marketing campaign in the second half of 2006. In essence, the claim is that many of these PCs were not really capable of running Vista properly, because they could only run Vista Basic, and not Vista’s distinctive Aero graphics.

This is not just about eye candy. See this explanation of Vista’s Desktop Window Manager, part of Aero, from Microsoft’s Greg Schechter:

The primary takeaway for desktop composition:  the way an application gets pixels on the screen has fundamentally changed.

It’s fair to say that missing out on Aero means missing out on a core feature of Vista.

Todd Bishop’s Microsoft blog has more details on the case, including a large PDF document showing internal correspondence from Microsoft and its partners, giving insight into how the Vista Capable campaign evolved.

The problem was that Microsoft allowed machines to carry the “Vista Capable” sticker even if they were not able to run Aero. An email from Microsoft’s Ken Goetsch:

We have removed the technical requirement that a Windows Vista Capable PC contains a Graphics Processor Unit (GPU) that supports the Windows Display Driver Model (WDDM), formerly known as the Longhorn Display Driver Model.

Other correspondence in the PDF shows that many at Microsoft were uneasy with this decision; however, it was apparently made to help out Intel. Here’s an internal email from John Kalkman, dated February 26 2007:

In the end, we lowered the requirement to help Intel make their quarterly earnings so they could continue to sell motherboards with 915 graphics embedded. This in turn did two things: 1. Decreased focus of OEMs planning and shipping higher-end graphics for Vista ready programs and 2. Reduced the focus by IHVs to ready great WHQL qualified graphics drivers. We can see this today with Intel’s inability to ship a compelling full featured 945 graphics driver for Windows Vista.

Later he says:

It was a mistake on our part to change the original graphics requirements. This created confusion in the industry on how important the aspect of visual computing would play as a feature set to new Windows Vista upgraders.

Now I know why I have over two hundred comments to my January 2007 post, Vista display driver takes a break. My laptop, a Toshiba Portege M400, has the 945 chipset. I bought it specifically to run Vista, towards the end of 2006; and yes, it has a “Windows Vista Capable” sticker. The early Vista graphics drivers were indeed faulty, though in my case a February 2007 update pretty much fixed the problems. I was lucky it did not have a 915 chipset.

How did all this mess come about? The heart of the problem seems to be the infamous Vista reset in 2004, when a ton of work on Longhorn was scrapped and development resumed on the Windows Server 2003 codebase. This was almost certainly a good decision (or the least-bad one possible), but the consequence was that Vista was very late. Another reason was the huge effort put into Windows XP SP2, itself forced by the number of desperate security problems in Windows XP.

So Vista was late, and in consequence was rushed. In addition, PC sales were sagging because XP was old and people were waiting for Vista (or switching to Macs), so Intel had overstock. All the pieces were now in place for a Vista-capable sticker whose meaning was not what most people would expect.

Embarrassing for Microsoft. It is better to be transparent even with bad news like “Your PC will never run Vista properly” than to fudge the issue. The episode also illustrates one of the downsides of working with multiple hardware partners, rather than keeping both hardware and software in-house as Apple does.


Silverlight steals the show at Microsoft’s UK Heroes launch

I attended Microsoft’s “Heroes Happen Here” launch in London yesterday, which overlapped with the US launch presented by CEO Steve Ballmer. The launch is for Visual Studio 2008, Windows Server 2008, and SQL Server 2008, though these products are at varying stages of readiness.

The event was marred by excessive reliance on buzzwords like “Dynamic IT” – someone should tell Microsoft that phrases like this, or “People Ready” which was used for the Vista launch, have no meaning. Dr Andrew Hopkirk from the UK’s National Computing Centre enthused about the general benefits of virtualization, which led to a comical moment later. I asked one of Hopkirk’s colleagues what the NCC thought about Microsoft’s Hyper-V or other virtualization technologies. “Oh, we haven’t evaluated it,” he said. “Most people use VMware and they love it”.

I hate to be disloyal, but the US event, which was relayed by satellite and which hardly any of the UK journalists watched, was more up my street. Ballmer didn’t shout too much, and I liked the drilldowns into specific features of the three products.

Still, after several dry presentations the UK event brightened up when Paul Curtis from EasyJet, a UK budget airline, showed us a proof-of-concept Silverlight application which the company plans to implement on its web site towards the end of this year. We saw an attractive Rich Internet Application which was a mash-up of flight routes and fares, Microsoft Virtual Earth, and reviews from TripAdvisor. Here’s a blurry snap of how you might book a hotel in Barcelona. It’s a compelling visual UI which of course reminded me of similar things I’ve seen implemented with Adobe’s Flash and Flex. Behind the scenes the app will use Server 2008, IIS 7.0, and a SQL Server 2008 Data Warehouse, so this is the perfect case study.

I wanted to ask Curtis whether he was happy with Silverlight’s cross-platform capabilities, and why he was using Silverlight in preference to Flash. However, his bio states that he is a member of the Windows Live Special Interest Group and on the Microsoft Architect Council, so I suspect the answer would be, “it’s what we know.” It does support my impression that despite the rise of Flash, there is still a place for Silverlight within the large Microsoft platform community.

Finally, there was brief mention of high take-up for Microsoft Softgrid, which is described as “application virtualization”. I’ve made this the subject of a separate post.

PS: I met blogger Mark Wilson at the event; he has a more detailed write-up.

Microsoft Softgrid: virtualization for applications

At Microsoft’s Heroes Happen Here launch, I caught up a little with something to which I’ve paid insufficient attention: Microsoft Softgrid, which is described as Application Virtualization. Softgrid is a way of packaging an application and its dependencies into an isolated bundle that runs on the client, but hardly touches the client environment. Each application has its own virtual registry, DLLs, COM DLLs, and even a virtual file system. As a consequence, it “just works”. It also lets you run otherwise incompatible applications side by side. For example, you could have an old Access 97 application, for which the developer left long ago and nobody dares to touch the code, and run it alongside Access 2007. This is apparently a huge hit for Microsoft, which does not surprise me, as it solves all sorts of deployment problems. Unfortunately it’s not that easy to get Softgrid: you need to sign up for the Desktop Optimization Pack for Software Assurance, and it is hooked to other components of Microsoft’s enterprise server system.

I would like to see Softgrid’s technology also made available for more general use.


Tips on digitising vinyl

Seeing announcements for Sony’s new PS-LX300USB USB turntable reminds me just how misunderstood an area this is.

I sense that this post may work best in the form of a Q&A, so here goes.

Is digitising vinyl worth the effort?

In many cases, no. Try Amazon or eBay: chances are you can pick up a second-hand CD of the same music for not much money, often with bonus tracks, or pick and choose from Amazon’s download store or iTunes.

Digitising vinyl is a great deal more work than ripping a CD, and the results may well be less satisfactory. When you rip a CD, it arrives on your hard drive already split into tracks, with metadata such as artist, album and track names already filled in – provided the CD is in one of the internet databases like freedb, which it almost always is. That’s not the case with vinyl rips; you have to edit the tracks yourself. Further, CDs don’t usually suffer from crackles, dust, scratches or inner-groove distortion, all of which afflict vinyl.

That said, there are a few reasons why you might want to do this:

  1. A CD or legal download is not available
  2. You are time-rich and money poor
  3. You prefer the sound of the vinyl

Point 3 is the most interesting. Sometimes the vinyl was made from better tapes, or mastered better, or features a mix not available on CD. In these loudness-war days, even the vinyl of a new release may be better than the CD. Example: Icky Thump by the White Stripes. It has audiophile-quality vinyl mastering, but the usual over-compressed sound on CD.

Finally, sometimes you just want to hear that old sound that you remember. Pure nostalgia, but what’s wrong with that?

Still, if you are digitising vinyl for the sound, then the sound is all important. How then do you get the best results? Read on.

Will a USB turntable make it easy to convert my old records to digital?

Up to a point. A USB turntable is just a turntable with an integrated pre-amplifier and USB connection. It solves a technical problem, by applying RIAA equalization to the output from the phono cartridge and by including an analogue-to-digital converter. If you are already set up for playing vinyl, you can achieve the same thing by running a cable from the line-out of your hi-fi to the line-in on your PC or laptop soundcard.
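
If you are wondering what “RIAA equalization” actually means, here is a back-of-envelope sketch in Python – not a filter implementation, just the standard playback curve computed from its published time constants. Vinyl is cut with the bass reduced and the treble boosted; the pre-amp has to apply roughly the gains printed below to undo it.

    import math

    T1, T2, T3 = 3180e-6, 318e-6, 75e-6   # standard RIAA time constants, in seconds

    def riaa_playback_db(f):
        """Magnitude of the RIAA de-emphasis (playback) curve in dB, unnormalised."""
        w = 2 * math.pi * f
        mag = math.sqrt(1 + (w * T2) ** 2) / (
            math.sqrt(1 + (w * T1) ** 2) * math.sqrt(1 + (w * T3) ** 2))
        return 20 * math.log10(mag)

    ref = riaa_playback_db(1000)           # the curve is usually quoted relative to 1 kHz
    for f in (20, 100, 1000, 10000, 20000):
        print(f"{f:>5} Hz: {riaa_playback_db(f) - ref:+.1f} dB")
    # roughly +19 dB of bass boost at 20 Hz and -20 dB of treble cut at 20 kHz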

USB turntables will not make your old records any less scratched or dirty. They won’t turn the record over for you at the end of the side. They won’t magically enter the metadata or split the tracks (software might split the tracks for you, but it might not get it exactly right).
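
To give a feel for how that automatic splitting works – and why it sometimes gets it wrong – here is a minimal sketch in Python, assuming a 16-bit mono WAV capture and an invented file name. It simply looks for stretches of near-silence long enough to be the gap between tracks, which is exactly why quiet passages and noisy run-out grooves can fool it.

    import wave
    import numpy as np

    def find_track_breaks(path, silence_db=-45.0, min_gap=2.0, window=0.1):
        """Return start times (in seconds) of silent gaps long enough to be track breaks."""
        with wave.open(path, "rb") as w:
            rate = w.getframerate()
            audio = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16)
        step = int(rate * window)
        breaks, run_start = [], None
        for i in range(0, len(audio) - step, step):
            chunk = audio[i:i + step].astype(np.float64)
            rms_db = 20 * np.log10(np.sqrt(np.mean(chunk ** 2)) / 32768 + 1e-12)
            if rms_db < silence_db:
                run_start = i if run_start is None else run_start
            else:
                if run_start is not None and (i - run_start) / rate >= min_gap:
                    breaks.append(run_start / rate)
                run_start = None
        return breaks

    print(find_track_breaks("side_one.wav"))   # hypothetical capture of one LP side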

So you can guess the answer to the next question:

I already have a turntable. Do I need a USB turntable to digitise my vinyl?

No. In fact, your existing turntable may well be better. You do need a phono pre-amp, but you might have one built into your hi-fi amplifier, or you can get these separately. If you are already able to play vinyl, then you must already have a phono pre-amp somewhere in your set-up.

How do I get the best sound when digitising vinyl?

Not easily. Vinyl is a frustrating medium, because it takes so much effort and expense to get the best out of it. Critical elements include the turntable itself, the arm, the cartridge and stylus, and the setup (things like the tracking weight and bias); even the table the turntable sits on makes a difference (wall mounts are good). In other words, the starting point for getting the best sound is the quality of the turntable, and it’s unlikely that a cheap USB turntable bundle is going to deliver anything close to the best sound.

Next comes the phono pre-amp. Since cartridges have low output, the quality of the pre-amp is more critical than it is for CD, especially if you have a moving coil type. Again, it’s unlikely that the pre-amp in a USB turntable is up with the best.

Third, there is the analogue to digital converter, or ADC. Same story: critical component.

So if you truly want the best sound, you need an excellent turntable assembly, an excellent phono pre-amp, and an excellent ADC, preferably external to your computer. Not a USB turntable.

I don’t mean to dismiss the USB turntable idea completely. I’d expect fair results, and if you just want to recapture a memory or two, or hear the record granddad made of his trumpet playing, it will do the job nicely.

What is audio restoration and do I need it?

Once you’ve digitised the audio signal, you end up with a sound file that faithfully captures all the hiss and crackle of the original. What you want is just the music, without the hiss and crackle. Most audio processing software, including Audacity and Sony’s Sound Forge (which comes bundled with its USB turntables), has audio restoration filters that aim to do this.

Two big problems. One is that these filters cannot eliminate hiss and crackle, only reduce them. The second is that all these filters compromise the sound. The filters are smart, and aim to damage the music as little as possible, but some audible side-effects are pretty much inevitable.
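
To give a flavour of the crackle side of that trade-off, here is a crude sketch in Python, assuming the samples are already in a NumPy array: it treats any sample that jumps far away from a median-smoothed version of the signal as a click, and patches it with the smoothed value. Commercial filters are far more sophisticated, but the dilemma is the same – push the threshold harder and more of the music gets patched along with the clicks.

    import numpy as np
    from scipy.signal import medfilt

    def declick(samples, kernel=7, threshold=5.0):
        """Crude click reduction: replace outlier samples with a median-filtered estimate."""
        smooth = medfilt(samples.astype(np.float64), kernel_size=kernel)
        residual = samples - smooth
        sigma = np.median(np.abs(residual)) / 0.6745 + 1e-12   # robust estimate of the surface noise
        clicks = np.abs(residual) > threshold * sigma
        repaired = np.where(clicks, smooth, samples)
        return repaired, int(clicks.sum())   # repaired audio, and how many samples were patched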

Further, once you have your digital sound file you can try your hand at applying other effects to improve it. Normalization is a good one: it brings the volume up to an appropriate level, which is useful if you recorded the vinyl at too low a level. Then again, it is better to get the level right in the first place. By all means apply other effects as well, but bear in mind that it is very easy to make the music sound less good than it did before.
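
Peak normalization, at least, is simple and predictable, which is why it is one of the safer effects to apply. A minimal sketch, assuming floating-point samples in the range -1.0 to 1.0:

    import numpy as np

    def normalize(samples, target_db=-1.0):
        """Scale the recording so its loudest peak sits at target_db below full scale."""
        peak = np.max(np.abs(samples))
        if peak == 0:
            return samples                       # pure silence: nothing to do
        gain = (10 ** (target_db / 20)) / peak   # linear gain needed to reach the target peak
        return samples * gain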

My tip: the best audio restoration filter is none. They often do more harm than good. They are most useful when the original is really poor.

If you really have time on your hands, manual restoration can be very effective. You can zoom in on an annoying scratch and carefully edit just that short section, avoiding damage to the entire piece. Snag: it’s skilled work and takes ages.

A few closing comments

It can be worth it. Some early pressings of classic albums sound astonishingly good on vinyl.

I’d also like to put in a good word for Sony’s Sound Forge Audio Studio. It’s modestly priced, easy to use, fast and feature-rich. Audio Studio is a cut-down version of a professional tool, Sound Forge 9, which is also highly regarded. The main omission in the budget edition is multi-channel support; not a problem if you are digitising vinyl.

If you are in Europe, online retailer Needles & Spins has some handy “digitise your vinyl” bundles.

PS: Apparently some CD metadata databases can recognize “needledrops” – CDs sourced from vinyl – by looking at the track lengths. It is still going to be less comprehensive and reliable than a CD rip.


Trolltech says Qt for Windows CE coming in May

Trolltech has announced Windows CE support in the 4.4 release of Qt, its cross-platform development framework. A pre-release is already available. Qt already supports desktop Windows, Linux, and Mac OS X, so this plugs a significant gap. Features include SVG (Scalable Vector Graphics) and OpenGL. It’s good to see this going ahead despite Nokia’s acquisition of Trolltech, which is set to be completed in the second quarter of 2008. Nokia is committed to a couple of rival embedded operating systems, Linux and Symbian.

What about Qt for Symbian then? There are hints that it will happen. Then again, perhaps Nokia will increase its focus on Linux?


Chris Anderson on Freeconomics

Chris Anderson, the Wired editor who coined the term “long tail”, has written a lengthy piece on Free – why $0.00 is the future of business, and is writing a book on the subject.

It caught my eye in part because of what Sun’s Jonathan Schwartz told me the other day: the only acceptable price is free.

The idea must terrify Microsoft, which makes most of its money from software licenses, while letting third parties take the profits from custom development and services. Companies are less vulnerable if they sell both hardware and software, or have strong services departments.

The paradox here is that even when the marginal cost drops close to zero, there still has to be a business model. Something I am still trying to figure out as I give away content on this blog.

Adobe AIR now available; not just consumer fluff

Adobe has released AIR and you can download the runtime and SDK now, as well as FlexBuilder 3, the official IDE for AIR. Just to remind you, AIR is a way of running Flash applications on the desktop, supplemented by SQLite, a fast local database manager.

Among the most interesting case studies I’ve seen is from LMG, which runs loyalty schemes including the Nectar card and Air Miles. The big deal for the retailers is that using your loyalty card lets them identify who is buying what, providing mountains of data which can be mined for trends and the like. I do mean mountains. Nectar is used by Sainsburys. Between 25 and 40 million “basket items” are added to the database each day, and the database holds 2 years of data.

LMG’s Self-Serve is an app in development which enables Sainsburys and its suppliers to analyze this data; it could potentially be used by other retailers too. “The application answers questions like how’s my brand performing, who’s buying my brand, what else are they buying,” says Garth Ralston, LMG’s Business Intelligence Development Manager.

Self-Serve is built with AIR and Flex. “Excel spreadsheets, which some of our competitors use, and the pie charts that you can create within them, are so 1990’s”, says Ralston. “We’re looking for a little bit more of the Wow factor.”

A couple of things particularly interested me. One is that SQLite is critical for the app, which works by downloading large chunks of data and manipulating them on the client. This means that Self-Serve would not work as a browser application, except possibly with Google Gears, which also uses SQLite. Another is the importance of offline working. “The ability to have a user run the app, run a report, download the data to their system, take the laptop on the train and continue to work is an absolute business requirement”, says Ralston.
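
To make the pattern concrete, here is a minimal sketch of the same “download once, query offline” idea using SQLite from Python – the AIR client would do this through ActionScript’s SQLite API, and the table and column names here are invented.

    import sqlite3

    conn = sqlite3.connect("basket_cache.db")   # a local file, so it survives going offline
    conn.execute("""CREATE TABLE IF NOT EXISTS basket_items
                    (store TEXT, brand TEXT, week TEXT, quantity INTEGER)""")

    # While online: pull a chunk of data down from the server and cache it locally.
    downloaded = [("Camden", "BrandX", "2008-W09", 1200),
                  ("Leeds",  "BrandX", "2008-W09", 950)]
    conn.executemany("INSERT INTO basket_items VALUES (?, ?, ?, ?)", downloaded)
    conn.commit()

    # Later, on the train with no connection: reporting still works against the local cache.
    for store, total in conn.execute("""SELECT store, SUM(quantity) FROM basket_items
                                        WHERE brand = 'BrandX' GROUP BY store"""):
        print(store, total)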

James Governor, Redmonk analyst, told the press that BMC will be using AIR as a front-end to integrate its mainframe management offerings, and SAP will be using it. “Frankly, I think this will be the front-end for all SAP business applications,” he said. In other words, AIR is not just consumer fluff.

Governor is just back from Sun, as I am, and while I was there I picked up some anxiety at the way Flash and now AIR are doing what Java was intended to do – provide a rich cross-platform client. Has Adobe stolen Sun’s market? “Sun is quite capable of stealing its own market”, he said. “Java just hasn’t delivered the kind of rich desktop experiences that we would expect and hope.” That said, note that FlexBuilder is a Java application, Adobe’s server-side LiveCycle data services are Java, and Adobe’s ColdFusion runs on Java, so there are pros and cons here for Sun’s technology.

Actually, I suspect you could build Self-Serve in Java without much difficulty. The big win for AIR is that it’s home territory for multitudes of Flash designers. This is as much about designer and developer communities as it is about technology. The same applies to Microsoft’s Silverlight, which is ideal for Visual Studio developers to whom Flash is foreign.

I still have reservations about AIR, though there is also much to like. It’s early days of course; I’m looking forward to trying it for real. I also love the way these new initiatives are making us rethink the design of essential applications that have remained largely unchanged for years.

Fixing Windows Media Player after a system upgrade

A while back I upgraded my motherboard. Windows Media Player seemed fine – in fact, it works quite a bit better with the faster CPU – until today, when it started crashing shortly after starting. The faulting module was Indiv01.key.

The solution is in this thread. On Vista, what you have to do is to delete the contents of the folder C:\ProgramData\Microsoft\Windows\DRM (not the folder itself). Note that this folder is invisible by default. In Explorer – Folder Options – View, you have to check Show hidden files and folders, and uncheck Hide protected operating system files.
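
If you would rather script the clear-out than fiddle with Explorer settings, a minimal sketch in Python – run from an elevated prompt, and only once you have read the caveat below – looks like this:

    import shutil
    from pathlib import Path

    # Clear the contents of the Vista DRM store, but leave the folder itself in place.
    drm = Path(r"C:\ProgramData\Microsoft\Windows\DRM")
    for item in drm.iterdir():
        if item.is_dir():
            shutil.rmtree(item)   # remove sub-folders and their contents
        else:
            item.unlink()         # remove the individual licence and key files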

Observe the caveat:

Note that anything recorded on the old system that is DRM protected will not be playable after this procedure.

I recall doing something similar to get BBC iPlayer (download version) working.

This is all to do with tying DRM to hardware. You are not meant to copy a protected file to another PC and still be able to play it. There used to be a method for backing up and restoring your licenses, but it seems to have gone in Vista. From online help:

This version of the Player does not permit you to back up your media usage rights. However, depending upon where your protected files came from, you might be able to restore your rights over the Internet. For more information, see the question in this topic about how to restore your media usage rights.

This leaves a few questions for Microsoft to consider:

  • Why does a DRM problem break Windows Media Player even when playing non-DRM content?
  • Why does a DRM problem cause Windows Media Player to crash, rather than reporting a DRM problem?
  • Why, in order to fix a common problem, does the user have to uncheck the “Hide protected operating system files” box in Explorer options – a box marked “Recommended”, with a warning that you may make your computer inoperable?
  • Is it acceptable to say, “you might be able to restore your rights”, when a user could in theory have thousands of pounds invested in DRM-protected content?

Fortunately I don’t have any DRM-protected content that I am aware of.

Everything is fine now.

How long should it take to set up a laptop?

So you need a new laptop. Ignoring those irritating voices that say you should go Apple, you select a value-for-money offering from one of the big names like Toshiba or HP, hit the buy button at Ebuyer or the like, and a day or so later a van is at the door and you have your shiny new laptop. You slit the tape, pull the thing out of the box, plug it in and turn it on. How long should it take before you are happily typing away in Word or enjoying a DVD?

The answer I guess is as short a time as possible. In principle, I don’t see why it should take more than 5 or 10 minutes. The manufacturer has pre-installed the operating system and can ensure that all the right drivers are in place.

Here’s what actually happened when I did this for a friend yesterday. Toshiba Satellite Pro A200 with Vista Business. Not a bad machine, great value. We also had a key to activate Office 2007, which came pre-installed as part of Microsoft’s Office Ready scheme.

I started mid-morning and turned it on. It took ages before it let me in; I lost count of the reboots. There was some sort of partitioning dance, then when Vista itself started up it went through an optimisation process, then various Toshiba and third-party utilities installed themselves, sometimes requiring a reboot. I broke for lunch.

After lunch I connected to the Internet. Vista immediately set about downloading updates. Needed reboots, naturally. Then I ran the Office Activation Wizard. Microsoft’s Office-Ready program is great marketing, but fairly annoying, because typically you don’t want to purchase all of it. In our case we had purchased Office Small Business, but not Access. In consequence, you end up with an installation that is partially a trial version, even though you have paid. I’ve heard of this scenario actually preventing a machine from passing “Genuine Office Validation” when trying to download updates from Microsoft. Not a good way to treat customers. The solution is to uninstall the bits of Office you are not actually buying.

At this point I could have declared “job done”, but I knew that it wasn’t. I applied Vista SP1, which takes ages. I applied Office 2007 SP1, which is fairly quick. I removed a few things that I knew would not be needed, like Outlook’s Business Contact Manager.

I uninstalled Toshiba’s ConfigFree utility. This is a thing that is meant to “simplify” managing wireless (and wired) networks. It hijacks Vista’s perfectly good built-in wireless configuration utility. Now, it is possible that ConfigFree genuinely offers some added value, but even if it does, this kind of thing is still a nuisance. First, because people like me know how the Windows version works, and are disinclined to learn the foibles of an unnecessary replacement. Second, because the official item will be maintained and updated through Windows Update, rather than at the whim of Toshiba (or whoever).

If you are really unlucky, the supplier of your wireless card, or wireless router, or your ISP, will persuade you to install yet more network configuration software. Once two or three of these guys are fighting to manage and diagnose your wireless connection, you have little chance of connecting successfully to anything.

Then there is anti-virus to think about. Personally I reckon the practice of installing trial versions of Norton’s anti-virus suite (or similar) is a disgrace. It makes for a lousy user experience, because the first thing you see after enduring setup is a nag screen assuring you that your new computer is insecure. It is a disgrace because if you accept the trial but don’t pay up, you end up with an out-of-date anti-virus utility, which leaves you vulnerable. Let’s not forget that basic anti-virus software is available for free from AVG and a few others – if Toshiba really cared about the security of its customers, it would pre-install that instead. I have zero confidence in anti-virus software anyway, but this is not the place for that argument.

Result overall: three to four hours spent on something that should take a few minutes.

I have a good understanding of the commercial, technical and political reasons for these hassles, and I don’t regard Toshiba as the worst offender. Nevertheless, Microsoft and its partners have failed to tackle the problem effectively, and this is a factor behind Apple’s resurgence. Frankly, Ubuntu and other Linux distros are more fun to install, though with Linux you inevitably end up Googling to solve one or more strange issues, so overall it is no better for the non-technical user.

Recently I’ve been working with Windows Server 2008, which is a delight by comparison. The concept is simple: pre-install the bare bones, and make all the features optional. So Microsoft can do it. Why can’t consumer Windows work the same way? Install a clean, fast, basic version of Windows, and then let the user decide what else they require?