Microsoft’s muddled licensing for Office Web Apps

I’ve been reviewing Microsoft’s Small Business Server 2011 – mainly the standard edition as that is the one that is finished. The more interesting cloud-oriented Essentials version is not coming until sometime next year.

In its marketing [pdf] for SBS 2011 Microsoft says:

Get things done from virtually wherever and whenever. With Office Web Apps (included in SharePoint Foundation 2010), users can view, create, and edit documents anyplace with an Internet connection.

This appears to be only a half-truth. You can install Office Web Apps into SharePoint Foundation 2010, but it is not included in a default install of SBS 2011 Standard, and as far as I can tell the setup for it is not on the DVD. If you try to download it, you will find it is only available through the Volume Licensing Service Center, and that you require a volume license for Microsoft Office to get it. You can also get it through TechNet, but this is for evaluation only.

The Office Web Apps site states:

Business customers licensed for Microsoft Office 2010 through a Volume Licensing program can run Office Web Apps on-premises on a server running Microsoft SharePoint Foundation 2010 or Microsoft SharePoint Server 2010.

and it also appears that each user requires a volume license for desktop Microsoft Office in order to use it. In other words, the Client Access License for Office Web Apps is a volume license for Office. You cannot purchase a volume license for 5 users, and then have everyone in your 50-person organisation use it.

This approach to licensing makes no sense. In fact, I’m not sure it is even internally consistent. Part of the web app concept is that you could, if need be, walk up to a PC in an internet cafe, log in to SharePoint, and make a quick edit to a Word document. You are not going to ask the management “is this machine correctly licensed for Office Web Apps?”

What if you are using Linux, or an Apple iPad (it almost works), or a RIM PlayBook, or some other device on which Office cannot be installed? These are scenarios where Office Web Apps is particularly useful; Microsoft cannot expect users to buy a license for desktop Office for machines which cannot run it.

Note that the Office Web Apps are severely cut down in comparison to the desktop editions; they are not even close to the same thing. Further, Microsoft lets anyone in the world use Office Web Apps for free – provided the documents are on SkyDrive and not on a locally installed SharePoint.

Microsoft is also happy to give users of Office 365, the forthcoming hosted version of server apps including SharePoint, access to Office Web Apps:

Work from virtually any place and any device with the Office Web Apps

I’m guessing that somewhere in Microsoft the powerful Office group is insisting that Office Web Apps is a feature of the desktop product. Anyone else can see that it is not; it is a feature of SharePoint. Excluding it from SBS 2011 by default does nothing except complicate matters for admins – and it is a fiddly install – thus reducing the appeal of the product.

Incidentally, I see nothing unreasonable about Microsoft charging for an on-premises install of Office Web Apps. But it should be licensed as a web application, not as a desktop application.

For more on this see Sharon Richardson’s post and Susan Bradley’s complaint.

Fixing a slow Windows XP PC

Yesterday I investigated a Windows XP machine that had become so slow it was unusable. It was a Dell Dimension 2350 with 1GB RAM and a 2.00GHz Celeron CPU – not too bad a spec for XP – that had been out of use for a while and was being brought back into service for a specific and undemanding task. At first it performed fine, but after applying Service Pack 3 and installing Microsoft Security Essentials it had ground almost to a halt. The machine performed so badly that trying to troubleshoot it was like wading through glue. You could get Task Manager up and see plenty of RAM free, but the CPU was stuck at 100%.

After trying a few futile things like updating the BIOS, I installed Process Explorer and Process Monitor from Sysinternals. Looking at the activity summary in Process Monitor it was obvious which process was to blame: an instance of svchost.exe started with the command line: c:\windows\system32\svchost.exe -k netsvcs

However, netsvcs is responsible for many different services. I did a bit more poking around with Process Explorer and found the culprit: Windows Automatic Updates. Typing:

net stop wuauserv

at a command prompt fixed the problem temporarily.

It appears that the Windows Update database, which you can find in %windir%\SoftwareDistribution\DataStore, can get corrupted. The Windows Update service then goes into a spin and consumes all your computing resources. You can turn Automatic Updates off by right-clicking My Computer, choosing Properties, and selecting the Automatic Updates tab; or you can fix it the brute-force way by deleting the DataStore folder and letting Windows recreate it, though you lose your update history; or you can try to repair the database.
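The brute-force fix can be scripted from a command prompt. Treat this as a sketch rather than a tested recipe – back up first, and remember that deleting DataStore also deletes your update history:

```
rem Stop the Windows Update service so the database files are not locked
net stop wuauserv

rem Delete the possibly-corrupt update database; Windows rebuilds it
rd /s /q "%windir%\SoftwareDistribution\DataStore"

rem Restart the service; a fresh DataStore is created on the next update check
net start wuauserv
```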

Of course there are many reasons why Windows XP might run slowly, and often it is not easy to troubleshoot. There is abundant well-meaning advice on the internet, much of it based on the assumption that malware is involved, but finding the right answer to a particular problem is a matter of luck. In a professional context it is hardly worth the time, and corporates will simply re-image the machine.

I do find it interesting that when Windows XP first appeared in 2001 it specified a minimum of 64MB RAM and ran OK in 128MB. Once fully patched with Service Pack 3, automatic updates, Internet Explorer 8 and anti-virus, it needs at least 512MB and, in my experience, 1GB to be comfortable. Unfortunately you have little choice; if you want to connect to the Internet or run recent applications, you have to update it. Automatic updates is also a near-essential security feature.

Finally, kudos to the Sysinternals team whose tools are invaluable for solving this kind of problem.

Single sign-on from Active Directory to Windows Azure: big feature, still challenging

Microsoft has posted a white paper setting out what you need to do in order to have users who are signed on to a local Windows domain seamlessly use an Azure-hosted application, without having to sign in again.

I think this is a huge feature. Maintaining a single user directory is more secure and more robust than efforts to synchronise a local directory with a cloud-hosted directory, and directory synchronisation is a point of friction when it comes to adopting services such as Google Apps. Single sign-on with federated directory services takes that friction away. As an application developer, you can write code that looks the same as it would for a locally deployed application, but host it on Azure.

There is also a usability issue. Users hate having to sign in multiple times, and hate it even more if they have to maintain separate username/password combinations for different applications (though we all do).

The white paper explains how to use Active Directory Federation Services (ADFS) and Windows Identity Foundation (WIF, part of the .NET Framework) to achieve both single sign-on and access to user data across local network and cloud.


The snag? It is a complex process. The white paper has a walk-through, though to complete it you also need this guide on setting up ADFS and WIF. There are numerous steps, some of which are not obvious. Did you know that “.NET 4.0 has new behavior that, by default, will cause an error condition on a page request that contains a WS-Federation authentication token”?
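That .NET 4.0 behaviour has a commonly cited workaround: relax ASP.NET request validation so that the POSTed WS-Federation token is not rejected as dangerous markup. A hedged web.config sketch follows – check it against current Microsoft guidance before relying on it, and consider a custom request validator rather than loosening validation site-wide:

```
<!-- web.config fragment (sketch): ASP.NET 4 request validation treats the
     POSTed WS-Federation token as a dangerous request value; reverting to
     2.0-mode validation is the widely documented workaround -->
<system.web>
  <httpRuntime requestValidationMode="2.0" />
  <pages validateRequest="false" />
</system.web>
```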

Of course dealing with complexity is part of the job of a developer or system administrator. Then again, complexity also means more to remember and more to troubleshoot, and less incentive to try it out.

One of the reasons I am enthusiastic about Windows Small Business Server Essentials (codename Aurora) is that it promises to do single sign-on to the cloud in a truly user-friendly manner. According to a briefing I had from SBS technical product manager Michael Leworthy, cloud application vendors will supply “cloud integration modules,” connectors that you install into your SBS to get instant single sign-on integration.

SBS Essentials does run ADFS under the covers, but you will not need a 35-page guide to get it working, or so we are promised. I admit, I have not been able to test this feature yet, and aside from Microsoft’s BPOS/Office 365 I do not know how many online applications will support it.

Still, this is the kind of thing that will get single sign-on with Active Directory widely adopted.

Consider Facebook Connect. Register your app with Facebook; write a few lines of JavaScript and PHP; and you can achieve the same results: single sign-on and access to user account information. Facebook knows that to get wide adoption for its identity platform it has to be easy to implement.

On Microsoft’s platform, another option is to join your Azure instance to the local domain. This is a feature of Azure Connect, currently in beta.

Are you using ADFS, with Azure or another platform? I would be interested to hear how it is going.

Microsoft inadvertently shares BPOS offline address books with other customers

According to an email I’ve seen, sent to customers of Microsoft BPOS (Business Productivity Online Suite), some users have found their Offline Address Book – an Exchange feature which stores a company’s internal address list – has been downloaded by other BPOS customers:

Microsoft recently became aware that, due to a configuration issue, Offline Address Book information for Business Productivity Online Suite–Standard customers could be inadvertently downloaded by other customers of the service, in a very specific circumstance. The issue was resolved within two hours of identification, and we completed a thorough review of processes to prevent this type of issue from occurring again. Our records indicate that a very small number of downloads actually occurred, and we are working with those few customers to remove the files.

This issue affected only Business Productivity Online Suite–Standard customers; no other Microsoft Online Services were affected.

Big deal? Probably not, especially as customer address lists, which might be useful to competitors, are not normally included in an Offline Address Book.

That said, any leakage of data from one customer to another is a serious issue, as it is exactly this possibility which deters users from using cloud services in the first place. It is an inherent hazard of multi-tenancy.

Still, kudos to Microsoft for owning up.

Adobe declares glittering results as CEO says Apple’s Flash ban has no impact on its revenue

Adobe has proudly declared its first billion-dollar quarter: $1,008m in the quarter ending December 3 2010, versus $757.3m in the same quarter of 2009.

I am not a financial analyst, but a few things leap out from the figures. One is that Omniture, the analytics company Adobe acquired at the end of 2009, is doing well and contributing significantly to Adobe’s revenue – $98.4m in Q4 2010. The billion-dollar quarter would not have happened without it. Second, Creative Suite 5 is selling well, better than Creative Suite 4.

Creative Suite 4 was released in October 2008, and Creative Suite 5 in April 2010. It is not perfect, but the following table compares the Creative Solutions segment (mainly Creative Suite) of the two products quarter by quarter from their respective release dates:

Quarters after release   1st     2nd     3rd     4th     5th     6th
Creative Suite 4         508.7   460.7   411.7   400.4   429.3   432.0
Creative Suite 5         532.7   549.7   542.1   –       –       –

CS4 drops off noticeably following an initial surge, whereas CS5 has kept on selling. It is a good product and a de-facto industry standard, but not every user is persuaded to upgrade every time a new release appears. My guess is that things like better 64-bit support – which makes a huge difference in the production tools – and new tricks in Photoshop have been successful in driving upgrades to CS5. Further, the explosion of premium mobile devices led by Apple’s iPhone and iPad has not been bad for Adobe, despite Apple CEO Steve Jobs doing his best to put down Flash. Publishers creating media for the iPad, for example, will most likely use Adobe’s tools to do so. CEO Shantanu Narayen said in the earnings call, “We have not seen any impact on our revenue from Apple’s choice [to not support Flash]”, though I am sure he would make a big deal of it if Apple were to change its mind.

Before getting too carried away though, I note that Creative Suite 3, released in March 2007, did just as well as CS5. Here are the figures:

Quarters after release   1st     2nd     3rd     4th     5th     6th
Creative Suite 3         436.6   545.5   570.5   543.5   527.2   493.6

In fact, Q4 2007 at $570.5m is still a record for Adobe’s Creative Solutions segment. So maybe CS4 was an unfortunate blip. Then again, not quite all the revenue in Creative Solutions is the suite; it also includes Flash Platform services such as media streaming. Further, the economy looked rosier in 2007.

Here is the quarter vs quarter comparison over the whole company:

                         Q4 2009   Q4 2010
Creative Solutions       429.3     542.1
Digital Enterprise       211.8     274.1
Omniture                 26.3      98.4
Platform                 47.0      46.1
Print and publishing     42.9      47.3

In this table, Creative Solutions has already been mentioned. Digital Enterprise, formerly called Business Productivity, includes Acrobat, LiveCycle and Connect web conferencing. Platform is confusing; according to the Q4 09 datasheet it includes the developer tools, Flash Platform Services and ColdFusion. However, the Q4 10 datasheet omits any list of products for Platform, though it includes them for the other segments, and lists ColdFusion under Print and Publishing along with Director, Contribute, PostScript, eLearning Suite and some other older products. According to this document [pdf], InDesign, which is huge in print publishing, is not included in Print and Publishing, so I guess it is in Creative Solutions.

In the earnings call, Adobe’s Mark Garrett did mention Platform, and attributed its growth (compared to Q3 2010) to “higher toolbar distribution revenue driven primarily by the release of the new Adobe Reader version 10 in the quarter.” This refers to the vile practice of foisting a third-party toolbar (unless they opt out) on people forced to download Adobe Reader because they have been sent a PDF. Perhaps in the light of these good results Adobe could be persuaded to stop doing so?

I am not sure how much this breakdown can be trusted as it makes little sense to me. Do not take the segment names too seriously then; but they are all we have when it comes to trying to compare like with like.

Still, clearly Adobe is doing well and has successfully steered around some nasty rocks that Apple threw in its way. I imagine that Microsoft’s decision to retreat from its efforts to establish Silverlight as a cross-platform rival to Flash has also helped build confidence in Adobe’s platform. The company’s point of vulnerability is its dependence on shrink-wrap software for the majority of its revenue; projects like the abandoned Rome show that Adobe knows how to move towards cloud-deployed, subscription-based software, but with business booming under its current model, and little sign of success so far for its cloud projects, you can understand why the company is in no hurry to change.

Why Windows Installer pops up when you run an application

Warning: this post is about old Windows hassles. I’ve written it partly because some of us still need to run old versions of Windows and applications, and partly because it reminds me that Windows has in fact improved, so that this sort of thing is less common – though there is still immense complexity under the surface which can leak out to cause you grief, especially for people like reviewers and developers who install lots of stuff.

I’ve been retreating to Windows XP recently, in order to tweak an old Visual Basic 6 application. VB6 can be persuaded to run on later versions of Windows, but it is not really happy there. I have an old XP installation that I migrated from a physical machine to a VM on Hyper-V.

I was annoyed to find that when I fired up VB 6, the Windows Installer would pop up – not for VB 6, but for Visual Studio 2005, which was also installed.


Worse still, after thrashing away for a bit it decided that it needed the original DVD:


I actually found the DVD and stuck it in. The installer ground away for ages with its deceptive progress bars – “20 seconds remaining” sitting there for 10 minutes – repeated what looked like a loop several times, then finally let me in to VB. All was well for the rest of that session; but after restarting the machine, if I started VB 6 the very same thing would happen again.

This annoyance is not confined to VB 6; it used to happen a lot in XP days, though in my experience it is much less common with Vista and Windows 7.

I investigated further. This article explains what happens:

What you see is the auto-repair feature of Windows Installer. When an application is launched, Windows Installer performs a health check in order to restore files or registry entries that may have been deleted. Such a health check is not only triggered by clicking a shortcut but also by other events, such as activation of a COM server. The events triggering a health check depend on the operating system.

When you see this auto-repair problem this means that Windows Installer came to the conclusion that some application is broken and needs to be repaired.

A good concept, but in practice one that often fails and causes frustration. The worst part of it is the lack of information. Look at the dialog above, which refers to “the feature you are trying to use”. But which feature? In my case, how can my VB 6 depend on a feature of Visual Studio 2005, which came later and does not include VB 6? In any case, it is a lie, since VB 6 works fine even after the installer fails to fix its missing feature.

Fortunately, the article explains how to troubleshoot. You go to Event Viewer’s Application log, where MsiInstaller entries will tell you which product and component raised the repair attempt. Unfortunately the component is identified by a GUID. What is it?
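Incidentally, on XP you can pull the same entries from the command line rather than clicking through Event Viewer. A sketch using the eventquery.vbs script that ships with XP Professional – the filter syntax here is from memory, so run eventquery /? to confirm on your system:

```
rem List Application-log events whose source is MsiInstaller
cscript %windir%\system32\eventquery.vbs /l application /fi "source eq MsiInstaller"
```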

To find out, you can try Google, or you can use a utility that queries the Windows Installer database. The best I’ve found is a tool called msiinv; the script mentioned in the post above did not work. You can find msiinv described by Aaron Stebner here, with a download link. Note how Stebner had to change the download locations because they kept breaking; a constant frustration with troubleshooting Windows, as Microsoft regularly moves or removes articles and downloads even when they are still useful.

Running msiinv with its verbose option (which you will need) seems to pretty much dump the entire Windows Installer database to a text file. You can then search for these GUIDs and find out what they are. You may even find products listed that are not in Control Panel’s Add/Remove Programs. You can remove these from the command line like this:

msiexec /x {GUID}

where GUID identifies the product to remove.
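Putting the steps together, the workflow looks something like this. The verbose switch and output format of msiinv vary between versions, so treat this as a sketch and check the tool’s usage text first; {GUID} stands for the actual GUID reported in the MsiInstaller event log entry:

```
rem Dump the Windows Installer database (verbose) to a text file
msiinv.exe -v > msiinv.txt

rem Find the product that owns the GUID from the MsiInstaller event
findstr /i "{GUID}" msiinv.txt

rem Once you have confirmed which product it is, remove it
msiexec /x {GUID}
```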

In my case I found beta versions of WinFX (which became .NET 3.0). I said this was old stuff! I removed them, restarted Windows, and VB6 started cleanly.

That still does not explain how they got hooked to VB6; the answer is probably somewhere in the msiinv output, but having fixed the issue I’m not inclined to spend more time on it.

Rumblings in the Subversion community as WANdisco claims to be “shaking it up”

Subversion is an open source version control system used by developers to manage source code; it was an improvement over CVS which it to some extent replaced. Everyone loved it until Linus Torvalds came up with an alternative called Git which is better suited for the distributed development typical of large open source projects like Linux. Now everyone loves Git – with a bit of love left over for another distributed system called Mercurial – and Subversion has become a tad unfashionable, though still widely used.

David Richards is president and CEO of WANdisco, which has a source code management suite based on Subversion. He has announced his intention to “shake up Software Change Management” by fixing Subversion’s weak points. He writes:

Enough is enough. Subversion gets a lot of criticism due to the shortcomings of branching and merging, especially when compared with GIT and others, and we simply don’t have the time to debate whether or not this should be done when it clearly should be.

Why so combative? Well, there are a few curious points here. Subversion was originally sponsored by CollabNet, which has its own ALM (Application Lifecycle Management) suite based on Subversion, as well as a free product called Subversion Edge which packages the official open source release with some convenient tools. Subversion did not become a top-level Apache project until February 17 2010. According to Richards, there is now competition between commercial companies to be seen as the primary Subversion sponsors. In a blog post today, Richards refers to “commercial interests that are dependent on the perception that they are the ones developing Subversion” and adds:

We also believe it’s unhelpful when certain unscrupulous committers decide to commit trivial changes in large files to simply get their stats up.

Richards feels that Subversion development has stagnated:

Didn’t the community just announce a road map? Yes they did, but that’s pretty much all that happened (and that really pisses us off.) The commit logs (code committed by developers to the project) tell the real story. We are not happy with the volume, speed or participation on the project right now. Blogging, or answering questions on user lists are important, but so is writing source code.

The not-so-veiled threat in Richards’ post is that he will fork Subversion if necessary. He says “we don’t believe that [forking] is necessary” but when he adds later that “we would prefer that this be a community effort” it seems clear that forking is an option.

Richards says that WANdisco held a “summit” of companies with a vested interest in Subversion, and that there was “a common theme: branching and merging must improve.”

Personally I like Subversion, though it is also obvious that Git is superior for many projects. Richards does not help his cause by accusing “GIT Fanatics” of being unfair in their criticisms. The comments to his post are worth reading, for example:

I used Subversion without prejudice over CVS because it was better. Today I use Git without prejudice over the other two. Git has changed how I work. I use facilities in Git that are not possible with Subversion or CVS.

Developers who take this view will not care much now about Subversion. Even so, a lot of people still use it, probably many more than use Git, and improvements would be welcome. I am not clear, though, why the CEO of WANdisco is sounding so embattled, or what politics or difference of opinion is dividing the Subversion community, and would like to hear more about it.

Update: clarified that Subversion Edge is a free product

Update 2: The Apache Software Foundation replies in a blog post:

WANdisco’s false implication that it is in some kind of steering position in Subversion development discredits the efforts of other contributors and companies … we welcome WANdisco’s involvement in Subversion, and failure on WANdisco’s part to address the above concerns will have no effect on the acceptance of technical work funded by WANdisco. We simply felt it necessary to clarify WANdisco’s role in Apache Subversion, for the benefit of our users and potential contributors.

First impressions of Google TV – get an Apple iPad instead?

I received a Google TV as an attendee at the Adobe MAX conference earlier this year; to be exact, a Logitech Revue. It is not yet available or customised for the UK, but with its universal power supply and standard HDMI connections it works OK, with some caveats.

The main snag with my evaluation is that I use a TV with built-in Freeview (over-the-air digital TV) and do not use a set top box. This is bad for Google TV, since it wants to sit between your set top box and your TV, with an HDMI in for the set top box and an HDMI out to your screen. Features like picture-in-picture, TV search, and the ability to choose a TV channel from within Google TV, depend on this. Without a set-top box you can only use Google TV for the web and apps.


I found myself comparing Google TV to Windows Media Center, which I have used extensively both directly attached to a TV, and over the network via Xbox 360. Windows Media Center gets round the set top box problem by having its own TV card. I actually like Windows Media Center a lot, though we had occasional glitches. If you have a PC connected directly, of course this also gives you the web on your TV. Sony’s PlayStation 3 also has a web browser with Adobe Flash support, as does Nintendo Wii though it is more basic.


What you get with Google TV is a small set top box – in my case it slipped unobtrusively onto a shelf below the TV – plus a wireless keyboard, an HDMI connector, and an IR blaster. Installation is straightforward, and the box recognised my TV to the extent that it can turn it on and off via the keyboard. The IR blaster lets you position an infra-red transmitter optimally for any IR devices you want to control from Google TV – typically your set top box.

I connected to the network through wi-fi initially, but for some reason this was glitchy and would lose the connection for no apparent reason. I plugged in an ethernet cable and all was well. This problem may be unique to my set-up, or something that gets a firmware fix, so no big deal.

There is a usability issue with the keyboard. This has a trackpad which operates a mouse pointer, under which are cursor keys and an OK button. You would think that the OK button represents a mouse click, but it does not. The mouse click button is at top left on the keyboard. Once I discovered this, the web browser (Chrome, of course) worked better. You do need the OK button for navigating the Google TV menus.

I also dislike having a keyboard floating around in the living room, though it can be useful especially for things like Gmail, Twitter or web forums on your TV. Another option is to control it from a mobile app on an Android smartphone.

The good news is that Google TV is excellent for playing web video on your TV. YouTube has a special “leanback” mode, optimised for viewing from a distance that works reasonably well, though amateur videos that look tolerable in a small frame in a web browser look terrible played full-screen in the living room. BBC iPlayer works well in on-demand mode; the download player would not install. Overall it was a bit better than the PS3, which is also pretty good for web video, but probably not by enough to justify the cost if you already have a PS3.

The bad news is that the rest of the Web on Google TV is disappointing. Fonts are blurry, and the resolution necessary to make a web page viewable from 12 feet back is often annoying. Flash works well, but Java seems to be absent.

Google also needs to put more thought into personalisation. The box encouraged me to set up a Google account, which will be necessary to purchase apps, giving me access to Gmail and so on; and I also set up the Twitter app. But typically the living room is a shared space: do you want, for example, a babysitter to have access to your Gmail and Twitter accounts? It needs some sort of profile management and log-in.

In general, the web experience you get by bringing your own laptop, netbook or iPad into the room is better than Google TV in most ways apart from web video. An iPad is similar in size to the Google TV keyboard.

Media on Google TV has potential, but is currently limited by the apps on offer. Logitech Media Player is supplied and is a DLNA client, so if you are lucky you will be able to play audio and video from something like a NAS (network attached storage) drive on your network. Codec support is limited.

In a sane, standardised world you would be able to stream music from Apple iTunes or a Squeezebox server to Google TV but we are not there yet.

One key feature of Google TV is for purchasing streamed videos from Netflix, Amazon VOD (Video on Demand) or Dish Network. I did not try this; they do not work yet in the UK. Reports are reasonably positive; but I do not think this is a big selling point since similar services are available by many other routes. 

Google TV is not in itself a DVR (Digital Video Recorder) but can control one.

All about the apps

Not too good so far then; but at some point you will be able to purchase apps from the Android marketplace – which is why attendees at the Adobe conference were given boxes. Nobody really knows what sort of impact apps for TV could have, and it seems to me that as a means of running apps – especially games – on a TV this unobtrusive device is promising.

Note that some TVs will come with Google TV built-in, solving the set top box issue, and if Google can make this a popular option it would have significant impact.

It is too early then to write it off; but it is a shame that Google has not learned the lesson of Apple, which is not to release a product until it is really ready.

Update: for the user’s perspective there is a mammoth thread on avsforum; I liked this post.

Don’t miss Ryan Dahl on Node.js

I’m just back from Dreamforce in San Francisco, where one of the sessions I enjoyed most was from Ryan Dahl in the Cloudstock pre-conference event.

He is the author of node.js, a binding for the V8 JavaScript engine – not for running in the browser, but for creating server apps. However, it is interesting even if you do not want to use V8, because of the approach he takes to concurrency and I/O. I wrote up the session here, under the title Nginx the new Apache, node.js the new PHP?

What was Dahl doing at a Dreamforce conference? That was a question that puzzled me, until later in the week when it was announced that Salesforce.com is acquiring Heroku. Heroku has been experimenting with running node.js on its hosted infrastructure for Ruby applications, and may come up with a Ruby wrapper.

The platform play

I’ve been mulling over the various announcements here at Dreamforce, which taken together are an attempt by Salesforce.com to transition from being a cloud CRM provider to becoming a cloud platform for generic applications. Of course this transition is not new – it began years ago with Force.com and the creation of the Apex language – and it might not be successful; but that is the aim, and this event is a pivotal moment with the announcement of Database.com and the Heroku acquisition.

One thing I’ve found interesting is that Salesforce.com sees Microsoft Azure as its main competition in the cloud platform space – even though alternatives such as Google and Amazon are better known in this context. The reason is that Azure is perceived as an enterprise platform, whereas Google and Amazon are seen more as commodity platforms. I’m not convinced that there is any technical justification for this view, but I can see that Salesforce.com is reassuringly corporate in its approach, and that customers seem generally satisfied with the support they receive, whereas this is often an issue with other cloud platforms. Salesforce.com is also more expensive, of course.

The interesting twist here is that Heroku, which hosts Ruby applications, is more aligned with the Google/Amazon/open source community than with the Salesforce.com corporate culture, and this divide has been a topic of much debate here. Salesforce.com says it wants Heroku to continue running just as it has done, and that it will not interfere with its approach to pricing or the fact that it hosts on Amazon’s servers – though it may add other options. While I am sure this is the intention, the Heroku team is tiny compared to that of its acquirer, and some degree of change is inevitable.

The key thing from the point of view of Salesforce.com is that Heroku remains equally attractive to developers, small or large. While Force.com has not failed exactly, it has not succeeded in attracting the diversity of developers that the company must have hoped for. Note that the revenue of Salesforce.com remains 75-80% from the CRM application, according to a briefing I had yesterday.

What is the benefit to Salesforce.com of hosting thousands of Ruby developers? If they remain on Heroku as it is at the moment, probably not that much – other than the kudos of supporting a cool development platform. But I’m guessing the company anticipates that a proportion of those developers will want to move to the next level, using Database.com and taking advantage of its built-in security features, which require user accounts on Salesforce.com. Note that features such as row-level security only work if you use the Salesforce.com user directory. Once customers take that step, they have a significant commitment to the platform, and integrating with other services such as Chatter for collaboration becomes easy.

The other angle on this is that the arrival of Heroku and VMforce gives existing Salesforce.com customers the ability to write applications in full Java or Ruby, rather than being restricted to tools like Visualforce and the Apex language. Of course they could do this before by using the web services API and hosting applications elsewhere, but now they will be able to do this entirely on the Salesforce.com cloud platform.

That’s how the strategy looks to me; and it will be fascinating to look back a year from now and see how it has played out. While it makes some sense, I am not sure how readily typical Heroku customers will transition to Force.com or the Salesforce.com identity platform.

There is another way in which could win. Heroku knows how to appeal to developers, and in theory has a lot to teach the company about evangelising its platform to a new community.