News: Steven Sinofsky says nothing about Windows 7

I feel for CNET’s Ina Fried, who got an interview with Microsoft’s Steven Sinofsky to talk about Windows 7 but got nothing of substance out of him, even though he is the right person to ask. I did enjoy this bit of circumlocution, though. Sinofsky is talking about how Microsoft “re-plumbed” the graphics in Vista:

The team worked super hard with the partners in graphics to really do a great job, but the schedule challenges that we had, and the information disclosure weren’t consistent with the realities of the project, which made it all a much trickier end point when we got to the general availability in January.

Who are the “partners in graphics”? Sinofsky is talking about third-party vendors of graphics cards, mostly ATI, NVIDIA and Intel. What is the relevance of “information disclosure”? Sinofsky is talking about how the information delivered by Microsoft to these vendors was insufficiently accurate, complete or consistent for them to create robust drivers in time. What is a “trickier end point”? Well, problems like this driver error I guess – an earlier post which has just clocked up its 244th comment.

So now we are getting a few confessions about Vista, but that does not tell us much about Windows 7; except that there will be less re-plumbing and more high-level changes. Maybe.

If you are still curious about Windows 7, there are always the rumours about Ribbon, Jewel, and the new “markup based UI and a small, high performance, native code runtime” to chew on.

UK Rock Band prices strike an ugly chord

I was surprised to see that the highly-regarded Rock Band game has been rated only a two-star game by customer reviewers at Amazon.co.uk.

The reason: not the game, but the price. The Band in a Box package, which costs up to £129.00, neglects to include the game itself: buy it separately for up to £49.99.

Amazon’s prices are a little less; but in the USA Amazon.com offers the whole lot for $149.99 (list price $169.99). I’m using the Xbox 360 versions for this example.

The problem for Electronic Arts is that buyers are only a click away from checking out the prices in other countries. “The price is a joke. No purchase,” says one UK customer review.

Will it seriously affect sales? Only if people really do refuse to buy; then of course we’ll see the price lowered. However, it is currently occupying positions two and three in Amazon’s all-format bestseller chart, so maybe not.

Aside: why does Amazon say this has a PEGI rating of 16 for the instruments, but only 12 for the game? If the problem is risqué lyrics, you would have thought it would be the other way round.

More on Debian’s OpenSSL bungle

I reported on this in the Guardian. It was an interesting piece to research. First, the history: you can find the exchange between Kurt Roeckx and Ulf Möller here. It was an unfortunate mistake, and I make mistakes too (it was my fault that a name was misspelt in the Guardian piece, for example), so rather than heap blame on individuals I suggest this is really a problem with the process: the only people making significant changes to the source code of such a critical library should be the committers responsible for that library. No doubt the incident is prompting a review of the process for updating Debian, Ubuntu and other distros; perhaps we will end up with a slower but less vulnerable flow of updates.

Second, a remark from Tim Callan at Verisign which there was not room for in this piece. I asked him whether Verisign knows which of the certificates it has issued are bad. “Unfortunately we don’t have those key pairs to look at them and scan them and tell which ones are good and which ones are not,” he told me. All Verisign can do is to ask its customers to check, which Callan says it is doing “very very aggressively.” In mitigation, Verisign does have a record of what operating system was used to purchase the certificate, but this is not the same thing; it is an imperfect process. The only fix is to revoke and replace the bad ones, which the company is offering to do for free.

Third, there are two distinct risks here. First, weak SSL certificates. Verisign is embarrassed because it has been issuing weak certificates; its core product has been undermined. However, according to Netcraft, of the 870,000 secure web servers on the Internet, only 20,000 report themselves as Debian and 4,000 as Ubuntu. The true figure will be somewhat higher than that, but even so it is a small proportion – under three per cent – and exploiting the weakness takes a bit of effort.

The second problem is the possibility of intercepting or cracking SSH tunnels used to administer affected servers. We saw this demonstrated at a hacking briefing run by NCC Group yesterday. Let’s assume that administrators use SSH authenticated with a private key – a common scenario – and that the key was generated by the faulty OpenSSL library. I suspect this will have been true for many more than 20,000 servers, though a lot will now have been fixed. All you need to do is to run a script against that server armed with a list of the possible keys – under a thousand, according to the demo we saw*. When you get a hit, you can connect to that server, most likely with full root permissions.
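To see why the list of possible keys is so short, here is a toy Python sketch of the underlying flaw – not the real OpenSSL code, just an illustration of a generator whose only entropy is a 15-bit process ID, as in the Debian bug:

```python
import random

# The Debian patch left the process ID as (essentially) the only entropy
# feeding OpenSSL's PRNG -- a 15-bit value, so at most 32,767 distinct
# keys per key type and architecture. Here the "key" is just 128 bits
# drawn from a PRNG seeded solely with the PID.

def weak_keygen(pid: int) -> int:
    rng = random.Random(pid)      # the seed is the only entropy
    return rng.getrandbits(128)   # stand-in for real RSA/DSA key material

# Enumerate every key the broken generator can ever produce.
all_keys = {weak_keygen(pid) for pid in range(1, 32768)}
print(len(all_keys))  # a key space this small is trivial to precompute
```

An attacker does this enumeration once, offline, and then simply tries each candidate key against the target’s SSH port.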

The most hardened servers will not be so easy to crack. They will authenticate as a user with limited rights, and use su to elevate. They will limit access to specific IP addresses. They will use additional passphrases. And they will have changed the keys within hours of the problem being discovered.
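As a rough illustration, the first two of those measures map onto sshd_config directives something like this (the user name and address are hypothetical; key passphrases are set client-side with ssh-keygen, and replacing the keys happens outside this file):

```
# Hypothetical sshd_config fragment illustrating the hardening above
PermitRootLogin no               # log in as a limited user, elevate with su
AllowUsers admin@203.0.113.10    # restrict access to a specific IP address
PubkeyAuthentication yes
PasswordAuthentication no
```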

Still, there are plenty of less secure servers out there, so what that means is that an unknown number of servers will have been compromised, and more will follow. If you are lucky, the intruders will hack your website and do obvious damage so the server will get cleaned up. If you are unlucky, the intruder will be discreet and quietly start stealing credit card numbers, or taking advantage of any information or privileges obtained to get access to additional servers or data, or make occasional use of the server in botnet attacks. Who knows?

Servers getting rooted is not a new problem; and it’s not yet clear whether this incident is more than a ripple. Colin Phipps at Netcraft doesn’t think it is. “We’ll see a lot of panicked system administrators,” he told me, “and we’ll see a lot of scepticism about open source.” That last point is probably the most significant.

*I’m told this was artificially reduced for the demo – but there are only 32,767 possible private keys to brute-force. However, even using the full set of 2048-bit RSA keys, NCC Group successfully broke into a system which used Debian-generated SSH keys in 20 minutes, and thinks it could often be done in half that time.

ODF support in Microsoft Office: a sign of strength, or weakness?

Big news in the document format wars today. Microsoft is (as far as I can tell) properly supporting ODF in Office. The press release states that both ODF and PDF will be fully integrated into Word, Excel and PowerPoint. This means Save As, not Export; and the possibility of setting ODF as a default save format.

The release adds:

Microsoft will join the Organization for the Advancement of Structured Information Standards (OASIS) technical committee working on the next version of ODF and will take part in the ISO/IEC working group being formed to work on ODF maintenance.

Reading the release, and comments by Doug Mahugh, it looks as if this is different code from the hopeless CleverAge translator, an open source project on SourceForge. That uses XSLT, which is inefficient for large documents and always seemed to me the wrong approach to take.

It seems that despite achieving ISO standardization for its own Open XML format, Microsoft is responding to pressure from large customers, especially in government and education, who want full ODF support.

Having said that, there are bound to be technical issues over the import and export. We have to wait to see the list of what may be converted incorrectly, or is not supported.

Let’s presume Microsoft has done a good job. Is this good for the company, or bad? Open Office does not support Open XML (don’t you love how everything is called “Open”), so this boosts ODF and therefore Open Office by making it more widely compatible. On the other hand, it could avoid lost sales to customers who would otherwise abandon Microsoft Office for lack of ODF support, which helps Microsoft. In the end, it’s hard to say how this will play out in terms of market share.

That said, it is undoubtedly good for users. Kudos to Microsoft for doing something to make their lives easier.


Microsoft Access needs a complete rethink – or retirement

Microsoft Access is now thoroughly out of sync with the company’s wider database technology. I’m writing an introductory piece on database applications, and the failure of Access to keep pace with what is happening elsewhere is glaringly apparent.

Let’s look at what database formats Access understands. There is its own native format: MDB, now joined by ACCDB, the updated and incompatible format introduced in Access 2007. Then there is SQL Server via the Access Data Project, now deprecated, which connects over OLEDB. Then there is the possibility of linking to external data from an MDB or ACCDB, which means ODBC or ancient drivers for things like dBase and Paradox. Finally, there are some special drivers for Excel and for SharePoint, which do not interest me greatly in this context.

What’s missing from this picture? Primarily ADO.NET, the core database technology in the .NET Framework. For example, what if you want to connect to a SQL Server Compact Edition database using Access? Microsoft in its wisdom does not provide an ODBC driver for SQL Server Compact Edition. There is an OLEDB driver, but you can only use this from VBA, not with the interactive Access user interface. In effect, Access is hopeless for working with SQL Server Compact Edition, which is a shame because this is an otherwise attractive choice for a file-based desktop database. There is an ADO.NET provider of course; but Access cannot use it because it does not understand .NET.

Microsoft gets grief from time to time over why it does not use the .NET Framework for its own Office applications. Although the Visual Studio Tools for Office (VSTO) link .NET reasonably effectively with Word, Excel and Outlook, the core applications are native code, and the core macro language for them is Visual Basic for Applications (VBA) – essentially the same language Microsoft retired for general development back in 2001, when VB.NET appeared. Access also has its own form engine, equally ancient; it does not even use standard VBA forms.

While there are good reasons why Office remains native code, it is Access that has suffered the most from the lack of .NET. It seems to me that Microsoft should either rebuild the product using the .NET Framework; or retire it. I suppose it could also do some clever integration work, adding .NET language and forms to the product, but for the effort involved it hardly seems worth it.

I have never much enjoyed programming with Access, but used to like it for interactive work and reports. I rarely use it now, for the reasons stated above.

The problems with Access hurt home and small-business users who start off with Microsoft Office and build a custom database, most likely an MDB or ACCDB. At some point they want to take it to the next step, maybe as it becomes a more sophisticated application, or needs to support more users, or has to be migrated to the Web. They then need to abandon most of their work, exporting the data and starting again. Access has become an embarrassment; it needs a complete rethink, or retirement.

Who needs AIR? NY Times does desktop Silverlight app for Mac

The New York Times is porting its excellent Times Reader application to the Mac using Silverlight 1.0:

Times Reader for the Mac is a native Cocoa application, which uses the Safari toolkit and Silverlight to render the pages.

Follow the link for some screengrabs. Adobe’s AIR (which also uses the WebKit engine behind Safari) is the obvious choice for this kind of online app; it’s interesting to see the NY Times adopting Silverlight in a similar manner.

I spoke to developer Nick Thuesen about this at Mix07, so this is not news for readers of this blog; though I’d become sceptical about whether it would be delivered because of the delay. Now, I’m surprised that the NY Times is still using Silverlight 1.0 rather than waiting for 2.0.

The Silverlight version appears to have some compromises. In particular, it cannot flow text on the client:

We paginate the pages for the Mac version on our servers (the Windows version does this on the PC). When you sync, we send you pages for the four window and three font sizes described above.

Still, the screens look good and I look forward to trying it – especially as the public beta will be free, whereas you need a subscription for the full release.

There is a high level of hostility towards Silverlight in the comments to the post. Mostly it appears to be religious in nature – i.e. Mac users hate all things Microsoft. It does illustrate the difficulty the company has in persuading the world to take its cross-platform ambitions seriously.

Thanks to Ryan Stewart for the link.

Microsoft: forget the Live Search Cashback, just improve the engine

Microsoft is paying users to use its search engine with a new search cashback scheme. Looks like an affiliate scheme where the commission is paid back to the customer. US only.

I think Microsoft should focus on improving its search engine. This morning, I needed to call a local electrician and figured that search would be quicker than using a phone book. I entered the name of the business and the town. For some reason, this stymied Live Search: the result I was looking for was not in the first 10 pages. The identical search on Google: the first four results matched, and the address and telephone number were at the top of the page with a little map.

In a poll last year 51% thought Google delivered the best results for an example search, while 35% preferred Live Search and 31% Yahoo. That’s an inconclusive result, and this is not an exact science; but personally I find Google almost always delivers better results, sometimes (as in the case this morning) dramatically so.

If Microsoft managed to reverse this I would switch to Live Search in a heartbeat.


Why we don’t talk about Zune

Brandon LeBlanc comments on last week’s Guardian article on DRM and says:

What is interesting to me is the article neglects to look at what Microsoft is doing with Zune in regards to DRM. Just like Apple and Amazon – the Zune Marketplace also offers DRM-free music.

According to this page on the Zune Marketplace:

Browse over three million songs you can preview and download—most are now available as MP3s that’ll play on your Zune device or any other MP3 player. Or get an instant music collection: Zune Pass gets you millions of downloads for just $14.99 a month.

Answering LeBlanc, one reason is that Microsoft has not made Zune available internationally, so its visibility in the UK is rather minimal. Nevertheless, the Zune developments are interesting. In fact, the Zune now has pretty much the business model many expect Apple’s iTunes and iPod/iPhone to have in the future – all-you-can-eat subscription, with a premium download option.

Still, Microsoft has a marketing problem with Zune. First, it’s perceived as a me-too answer to iTunes/iPod. Second, the branding is focused firmly on the Zune device, which has only a small market share. Amazon on the other hand makes great play of the iPod compatibility of its MP3 store. How can Microsoft promote Zune marketplace as a source for DRM-free iPod music, without undermining the whole Zune concept in which device and store are tied tightly together?


Installing .NET, PowerShell on Windows 2008 Server Core: it can be done

Dmitry Sotnikov explains how to install .NET and PowerShell on Windows Server 2008 in its Server Core configuration. It is necessary to tweak the .NET setup with Orca, a low-level editor for Microsoft Installer files. Note this is unsupported.

The lack of PowerShell is an annoyance; the lack of .NET is a major obstacle to making use of Server Core, so this is interesting work. Sotnikov does not say whether ASP.NET springs to life as well; I presume it may be possible.

I imagine that one of the issues with .NET on Server Core is that some parts of the Framework will not work because dependencies are missing. Server Core has little in the way of a GUI, so I would not expect System.Windows.Forms or Windows Presentation Foundation to work; yet the .NET runtime is all or nothing. This is changing; Microsoft has announced a Client Profile Setup to reduce the runtime size in .NET 3.5 SP1, for client applications.

What we now need is a Server Profile, tailored to work on Server Core.