What is Microsoft’s new language?

From Douglas Purdy’s blog:

It is not very often that you get to be part of a team that is developing a programming language that aspires to be used by every developer on the Microsoft platform.

In addition, it is not very often that you can be part of a team that aspires to radically change the dynamics of building a new language, to the extent that a developer can write their own model-driven language in a straightforward way while getting all the language services (Intellisense, colorization, etc.) for “free”.

I am lucky enough to be on such a team – and if you are interested you could be as well.

Something to do with Oslo I guess. And Live Mesh?

All will be revealed at PDC.


Role of web video in tech communications

Last week’s Live Mesh announcement was a significant one for Microsoft watchers. It was interesting to note that all the in-depth information came in the form of web video.

Personally I dislike this trend. Video cannot easily be scanned to see what it contains; it also requires audio, which is a nuisance. It is more work to quote from a video than to copy some text. Where possible I resort to playing videos at double speed, to come closer to the speed of reading, and to noting down the time of sections I want to return to.

Some of these problems could be mitigated by better presentation. For example, you could have summary text on the page next to an embedded video, with links to indexed points.

However I also recognize that I may be in a minority. Video has obvious advantages; it is more informal, and can include real demos as opposed to diagrams and screen grabs.

I am even contemplating trying some video publishing of my own; it is time I reviewed Adobe Visual Communicator.

Even so, I’d suggest that companies take the time to offer transcripts of important video content. Text has advantages too.

Microsoft Live Mesh is AIR++

This post on the Microsoft Live Dev blog reminded me to view some of the Live Mesh videos Microsoft has put out for developers – this quick tour is a good place to start; this video with Ori Amiga has more details with examples.

A few comments. First, it seems to me that Live Mesh is at heart a feed aggregator. It is interesting to me because I had high hopes for Microsoft’s plans to integrate RSS into the operating system, and wrote about it in 2005. Sadly, Microsoft messed up its common feed platform – though I am perhaps one of the few who uses it outside IE7 or Outlook, with a custom feed reader thrown together in VB.

Live Mesh takes the feed aggregation concept and adds a few things. These include a REST API for posts and updates; a synchronization engine; an identity system so that you can control access; and a local feed server that works entirely offline when needed. Hence MOE (Mesh Operating Environment), also known as the Service Composition Runtime.
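
Nothing about the API is public yet, so purely by way of illustration – the endpoint, feed name and payload below are entirely my own invention – I imagine adding an item to a Mesh feed looks something like an HTTP POST of an Atom entry:

# hypothetical sketch: Microsoft has not published the real Live Mesh endpoints or schema
curl -X POST https://mesh.example.com/feeds/documents/entries \
  -H "Content-Type: application/atom+xml" \
  -d "<entry><title>expenses.xls</title><content type='text'>updated on the laptop</content></entry>"

The appealing part is that the same request could equally be handled by the local MOE instance while offline, and synchronized up to the cloud hub later.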

By the way, Mesh can sync peer to peer as well as with the cloud hub. Interesting for intranet usage.

So what’s an application? A feed of course, one that contains stuff you can execute. The local runtime could be just an HTML and JavaScript engine; but you can see how nicely Silverlight fits into this scheme of things. It’s a neat deployment model. Buying an application becomes similar to subscribing to a web site, except you get an executable that works offline as well as online. As Amiga explains in the video above, this is about performance as well as convenience. The speed of the Net cannot match a local store.

Another aspect of this is that you can use Mesh services in your non-Mesh application, essentially as a data source that is automatically synchronized everywhere.

If I’m anywhere close to grasping this, then it is not inherently Windows-centric. It also strikes me that this is AIR++, where the ++ is services and synchronization; Adobe should worry – except that Adobe has AIR out already and is no doubt working on great things for version 2.0.

A question though: what’s the business model? Commercial Mesh-able services? Tools and hosting? Premium Mesh? Mesh with ads? Right now, I guess Microsoft will do anything to buy mind share and market share for cloud services; but that will not do long-term.

Schwartz vs Mickos on MySQL and open source

At least, that’s how it looks. I was intrigued when I saw reports raising the possibility of “high-end” features in MySQL being released under a closed-source license – confirmed (as a possibility) in a roundabout way here. I found it odd because Sun CEO Jonathan Schwartz had told me of Sun’s intention to open source everything.

So what does Schwartz think of the MySQL idea? Not much, according to his statement in this email interview with Tim O’Reilly:

Marten Mickos (SVP, Database Group at Sun, former CEO, MySQL) made some comments saying he was considering making available certain MySQL add-ons to MySQL Enterprise subscribers only – and as I said on stage, leaders at Sun have the autonomy to do what they think is right to maximize their business value – so long as they remember their responsibility to the corporation and all of its communities (from shareholders to developers). Not just their silo.

I think Marten got some fairly direct and immediate feedback saying the idea was a bad one – and we have no plans whatever of “hiding the ball,” of keeping any technology from the community. Everything Sun delivers will be freely available, via a free and open license (either GPL, LGPL or Mozilla/CDDL), to the community.

Everything.

No exception.

Seems clear enough to me.

Office 2007: what do you lose by setting binary formats as default?

I wrote a piece for IT Week about document format defaults in Office 2007. The problem is that users with Office 2007 start emailing documents to others who do not have the suite. It is not too bad for other Microsoft Office users, who can download a compatibility pack, but for users of other operating systems it is problematic, though there are online services like zamzar.com.

Someone who read it pointed out that with the binary formats Office 2007 works in “compatibility mode”. Doesn’t this lose most of the benefits of upgrading to Office 2007?

My suggestion: try it, and let me know what features you miss.

As far as I can tell, the two biggest issues in Word are with the equation editor, and themes. The equation editor is disabled after you save a document in .doc format, and themes are converted to styles. Personally I prefer styles to themes, and I rarely use the equation editor, so it is no loss to me. Further, it is not a problem doing Save As if you want to use some special feature like the equation editor. For sure, it beats getting that phone call when you are out of the office the next day: “That document you sent, it won’t open.”

What about Excel? Help says, “any new or enhanced Excel 2007 features are not available” in compatibility mode. I presume that would include the new larger sheet size. However, I’ve not bothered to convert any of my existing workbooks because I don’t actually notice any difference.

The ribbon, which is the big new feature in Office 2007, works the same whatever format you use.

Still, it is a fair point. If you find it easy to do a Save As for documents that need to be shared with users not using Office 2007, then there is no problem using the new formats.

But how do you enforce that across an enterprise? Not easy; and of course Windows Explorer hides the extension by default. Documents in the old format are described as:

Microsoft Office Word 97-2003 Document

instead of

Microsoft Office Word Document

though unless you have a wide column size you might well see them both as “Microsoft Offi”, thanks to a particularly user-hostile naming convention.

However, you can set the default save format across an enterprise, with group policy. To my mind, that’s better than sending out stuff that is unreadable.
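
For what it’s worth, the Office 2007 Administrative Templates expose this as a “Save files in this format” policy for Word, which behind the scenes boils down to a registry value along these lines – I am quoting the value name from memory, so check it against the templates before rolling it out:

reg add "HKCU\Software\Policies\Microsoft\Office\12.0\Word\Options" /v DefaultFormat /t REG_SZ /d Doc /f

Excel and PowerPoint have equivalent settings of their own.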

Buying a Microsoft code-signing certificate from Thawte? Don’t use Vista.

Here’s the problem. You go along to http://www.thawte.com and ask to buy a Microsoft Authenticode certificate. It’s the right thing to do; signing code is increasingly important in these days of Internet delivery of applications, and unsigned code presents the user with dire warnings that may unnerve them.

So you go to buy a certificate. The way this works is in two stages. When you apply for the certificate, you are issued with a new private key, but not the certificate itself. Thawte then does its due diligence, checking that you really do represent the organization for which you are requesting a certificate. Finally, you can go back and download the certificate and get on with signing your apps.

This process works differently on Vista than on XP. I got this wrong when I first tried it, because it is not obvious. To begin with, you have to relax IE’s security for the Thawte site – ironic, for a security operation – and make sure it is not running in protected mode. Next, the first page of the application is a big form with the details of the organization, how you are going to pay, and so on. If you complete this on Vista, and click Submit, you get a message saying “This web site is requesting a new certificate on your behalf”.

You complete the application, sit back and wait. A few days later you get an email saying your certificate is ready for download. You download it; it is a file called something like mycert.spc. You can right-click and choose Install Certificate, to place it in the Windows certificate store. You can even sign code with it. Just open a Visual Studio command prompt, type:

signtool signwizard

and off you go. You can select the new certificate from your certificate store, timestamp the code (recommended), and you’re done.
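
You can also skip the wizard. Assuming the certificate is installed in your personal store, signtool will pick it up by subject name; the company name, timestamp server and file name below are just placeholders:

signtool sign /n "My Company Ltd" /t http://timestamp.verisign.com/scripts/timstamp.dll myapp.exe

The /t option adds a timestamp, so the signature remains valid after the certificate itself expires.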

So what’s the problem? Well, what if you want to sign code on a different machine than the one on which you applied for the certificate? And what if you want to back up your certificate?

Did you realise when you made the purchase that you were irretrievably hooking the certificate to the actual Vista installation which you were using for the transaction?

It is all to do with the private key. To sign code, you need the private key, which was installed into your certificate store when that first page of the application was submitted. Unfortunately it is marked as non-exportable, which means the Export feature of Vista’s Certificate Manager will not let you export it. Thawte cannot re-issue the private key; the only solution I know of is to get the entire certificate revoked and reissued (fortunately this is a free service).

This problem does not occur on Windows XP. Here is the evidence. The screenshot below shows part of the application form on Vista:

Now, here is the same part of the form on Windows XP (still IE7):

Spot the difference? An additional section appears in XP, which lets you specify where to save your private key as a file with a .pvk extension. On Vista, you don’t get that choice and you don’t get a .pvk file. Once you have both the .pvk and the .spc files, you can back up or move the certificate wherever you want, with full signing capability. You can import the certificate plus private key into your certificate store using this tool:

http://www.microsoft.com/downloads/details.aspx?FamilyID=F9992C94-B129-46BC-B240-414BDFF679A7&displaylang=EN

which is billed as a tool for Office 2000, but works fine for this purpose.
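
Alternatively, if you have a recent Windows SDK installed, the pvk2pfx utility will merge the two files into a single .pfx, which you can then import anywhere or pass straight to signtool with the /f option. Something like this, with the file names and password as placeholders:

pvk2pfx -pvk mycert.pvk -spc mycert.spc -pfx mycert.pfx -po mypassword

Keep the resulting .pfx somewhere safe; it is now your portable backup of the certificate and private key.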

Now, I guess this is a security feature. If you have these private key files hanging around, they are easier to steal than if they are locked into your certificate store and marked non-exportable. Fair enough, but I’d rather make that decision for myself, than have it imposed by an obscure installation process.

Vista SP1 vs Server 2008 as a desktop OS: more comparisons

I’ve been intrigued by reports that Server 2008, suitably configured, makes a better desktop OS than Windows Vista. In my previous post on the subject, I reported some observations by others, suggesting that Server 2008 performs better than Vista with Service Pack 1, even though it is meant to have the same core components. I thought it was time I took a look myself.

I have some free space on my usual desktop box, so I created two new partitions and installed Vista 32-bit with Service Pack 1 on the first, and Server 2008 32-bit on the other.

Aside: Both installs were smooth. The integrated Vista SP1 install works nicely, and few updates were required after the first boot. It is remarkable how much more pleasant it is to install Vista from scratch, instead of dealing with an OEM pre-install. Surely it should be the other way round?

I tried to make both installs usable desktops. On both operating systems, I installed the driver for my Terratec soundcard, along with Intel’s .INF installer for the motherboard, Management Engine Interface, and storage driver. I also installed a recent NVidia driver. The result was that all devices were enabled in device manager.

On Server 2008 I also installed the Desktop Experience and .NET Framework 3.0. I enabled the network, the audio engine, the Themes service, Windows Update, and Aero graphics. I created a new user account and logged in as that user, so that UAC (User Account Control) was active. I set it to optimize performance for programs rather than background services.
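
For anyone wanting to reproduce this, most of it can be done from an elevated command prompt rather than by clicking around Server Manager. Roughly as follows, though I am quoting the feature and service names from memory, so treat them as approximate:

ServerManagerCmd -install Desktop-Experience
ServerManagerCmd -install NET-Framework
sc config Themes start= auto
net start Themes
sc config AudioEndpointBuilder start= auto
sc config AudioSrv start= auto
net start AudioSrv

Desktop Experience wants a restart before the themes behave properly, and Aero itself still has to be selected from the Personalization applet afterwards.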

Next I ran the PassMark performance tests I’ve used before. Advantage Server 08 – but not by much. It scored 1118.3 vs Vista’s 1102.3. I doubt this is significant; there is also a small variation between different runs, which could account for a difference like this.

Looking at the detailed results shows something intriguing though. On the Graphics 2D GUI test, which exercises Windows controls like listboxes, checkboxes and dropdowns, Server 2008 scored 149.8 operations per second, vs 119.2 on Vista – more than 25% faster. I hesitate to attach much significance to my simple tests, but that might account for a snappier feel in the user interface. I repeated this particular test several times; Vista never scored higher than 123, and Server 2008 was consistent too.

There was also a notable difference in the “Memory – Large RAM” test. Vista 32-bit performed 802 operations per second, Server 08 1074: just over 33% faster.

On most tests, Vista was slightly slower, though on the disk tests it was fractionally faster. There were no other differences as big as the above.

I thought it would be interesting to compare the list of running services on the two machines, after the changes mentioned above. Here are the services I spotted running on Vista but not Server 2008:

  • Computer Browser
  • Offline Files
  • Portable Device Enumerator
  • Program Compatibility Assistant
  • ReadyBoost
  • Security Center
  • SSDP Discovery
  • Superfetch
  • UPNP Device Host
  • Windows Connect
  • Windows Image Acquisition
  • Windows Search

and on Server 2008 but not Vista:

  • Remote registry
  • SL UI Notification
  • Windows Remote Management

So how would it be if Vista did not have the burden of these additional services? I stopped them. Result: no significant difference; the overall score was 1102.

Tentative conclusions

Benchmarks are not always a good measure of real-world performance. There are aspects of performance which the benchmark does not measure. In addition, some of the perceived advantage of Server 2008 is likely to be the effect of a new clean installation – never forget Windows Cruft.

Even so, on my particular system (Intel board, Core 2 Quad Q6600 CPU, NVidia 6800 graphics) Server 2008 does measure better. I’m particularly intrigued by the Graphics 2D GUI results. I do not know why Server 2008 is faster, but I look forward to the same improvement appearing in desktop Windows in due course.

Update – 2D performance difference solved

I’ve worked out the reason for the difference in Graphics 2D GUI performance. It is because Server 2008 defaults to different settings for visual effects. You can see these by right-clicking Computer in the Start menu, choosing Properties, Advanced System Settings, Advanced tab, Settings, Performance options. I am sure there are other routes to the same dialog, some of which may be less arduous.
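
One less arduous route, assuming I have the executable name right, is simply to type this into the Start menu search box or a command prompt:

SystemPropertiesPerformance.exe

which opens the Performance Options dialog directly.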

If I set these to Adjust for Best Performance on both systems, Vista actually goes ahead of Server 2008, with a score of 180 vs 172 on Graphics 2D GUI. That’s not much to worry about.

I’m satisfied that the performance differences between Server 2008 and Vista are mainly about configuration, rather than core components. If you want to speed up your own desktop, these settings are a good candidate for experimentation.


What to say about Ubuntu Hardy Heron?

I installed Ubuntu Hardy Heron, a “long term support” release which went final yesterday.

It’s a tricky thing to assess. There are in general two things to say about Linux. First, you can take the line that it is a wonderful thing: free, fast, responsive and capable. You can do your work on this, even run a business on it. You can write applications in Java, C# or any number of other languages. You can have fun with it too – it’s great for multimedia, just a shame that few games support it. Finally, it is nice to know that most of the world’s malware is targeting someone else’s operating system.

Alternatively, you can argue that Linux is fiddly, perplexing, over-complicated, inconsistent, and still not ready for the general public.

It is tempting to give Ubuntu an easy ride because it is free and because we so much want it to succeed; we need an alternative to the Microsoft tax or the Apple tax. Unfortunately you never have to look far to find little problems or things that should be easy but end up consuming considerable effort.

Here’s one thing I noticed today. Close Firefox. Open the Help Centre, and click a web link. The Help Centre opens Firefox with the link you requested, but then cannot be used until you close that Firefox instance. Trying to close it brings up a “Not responding” message. If Firefox was already running when you clicked the link, it is fine.

Here is another. Open Help Centre, click Playing Music, then Listen to online audio streams. It says I can install Real Player 10 and that it is available from the “commercial repository”. What is the “commercial” repository? This page describes four Ubuntu repositories: main, restricted, universe and multiverse. Real Player is not in any of them. Further, if you try to install it using apt-get, the following message appears:

Package realplayer is not available, but is referred to by another package. This may mean that the package is missing, has been obsoleted, or is only available from another source
E: Package realplayer has no installation candidate
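
Incidentally, a quick way to see whether a package exists in any of the repositories you have enabled is apt-cache; for example:

apt-cache search realplayer

which, as far as I can tell, comes back empty on a default Hardy install.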

Hey, it’s Linux. Just Google and you’ll find a way. Who needs Real Player anyway? But that’s not the point … the point is that these little issues crop up and make running Linux less fun for non-geeks.

Here’s another one: I tried GNU Chess. I poked around in Preferences and chose the 3D view. It said:

You are unable to play in 3D mode due to the following problems:
No Python OpenGL support
No Python GTKGLExt support

Please contact your system administrator to resolve these problems, until then you will be able to play chess in 2D mode.

Fair enough; it is a clear, accurate and informative message – aside from the bit about “contacting your system administrator” which sounds like it was borrowed from Windows. You can just about forgive it in business software, but this is a game.
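
For the record, the missing pieces are the Python OpenGL bindings. If I remember the Hardy package names correctly (treat them as my guess), the fix is:

sudo apt-get install python-opengl python-gtkglext1

But again, that is the sort of thing a geek works out in a minute and a home user never will.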

I still love Ubuntu. This one installed easily and updates nicely; the fancy graphics effects work smoothly; and most important, the same machine which felt slow with Vista now seems more like a high-performance workstation.

In other words, it is easy to support either line of argument. Personally I veer towards the favourable view; but I doubt fear of Ubuntu is keeping anyone in Redmond awake at night.

Microsoft: we might withdraw Yahoo offer

Chris Liddell, senior Vice President and CFO, speaking during yesterday’s earnings call:

As outlined in our recent letter to the Yahoo board, unless we make progress with Yahoo towards an agreement by this weekend, we will consider our alternatives. We will provide updates as appropriate next week. These alternatives clearly include taking an offer to Yahoo shareholders or to withdraw our proposal and focus on other opportunities, both organic and inorganic.

Personally I think the Yahoo deal would be bad for Microsoft. I think it is driven by financial people trying to sum two market shares in search; but it is not so simple. My view is based on problems of integration, morale and culture, plus the risk of further confusing an Internet strategy that is already opaque.

Although Microsoft continues to be trounced in search (not least because it is simply not as good as its competition), there are signs of progress elsewhere. Another snippet from the earnings call: General Manager Colleen Healy mentioned that Live ID take-up is up by 18% to 448 million. No doubt many of those will be worthless accounts, but not all of them. Revenue from online business is up. Organic growth and smaller acquisitions would work better for the company.
