All posts by onlyconnect

Xobni makes its point on the streets of LA

I’m in Los Angeles just before Microsoft’s Professional Developers Conference, where one of the themes will be Office 2010 and its new features. Yesterday, though, the streets near the conference centre were full of “ambassadors” for Xobni, handing out T-shirts to attendees heading for pre-conference events and promising the chance of cash prizes to anyone seen wearing one at the show.

Xobni is an Outlook add-in (read the name backwards: inbox) that pulls contact details into a side panel as you read your email, complete with previous activity and social connections from Facebook, Twitter, Salesforce.com and elsewhere.

According to the ambassador I spoke to, Outlook 2010 has a similar feature built in; the company is hoping that the hall will be filled with Xobni T-shirts when this is announced.

You have to give the company credit for its initiative.

Every add-on vendor has this problem – what to do if your feature ends up baked into the main product.

Picture to follow.


Have Windows OEM vendors learnt anything from Apple?

I’ve just set up a new consumer Windows 7 PC, an HP Compaq Presario CQ5231UK – not bad value at £399 (VAT included) with a Core 2 Duo E7500 (2.93GHz), 3GB RAM, Windows 7 Home Premium 64-bit – yes, 64-bit Windows really is mainstream now – a 500GB hard drive and NVIDIA G210 graphics.

For comparison, the cheapest current Apple Mac is the Mini at £499. It is not directly comparable, since its neat compact size is worth a premium, but it is slightly less well specified, with a slower processor, 2GB RAM and a 160GB drive. As for the iMac, it comes with a screen but costs more than twice as much as the HP Compaq.

A good deal then; but have Microsoft’s efforts to make Windows 7 “quieter” and less intrusive been wrecked by OEM vendors who cannot resist bundling deals with third parties, otherwise known as crapware?

I draw your attention to my interview with Microsoft’s Bill Buxton last year, when I raised this point. He said:

Everybody in that food chain gets it now. Everybody’s motivated to fix it. Thinking about the holistic experience is much easier now than it was two years ago.

I was interested therefore to see what sort of experience HP delivers with one of its new home PCs. Unfortunately I forgot to keep a full list, but I removed a number of add-ons that the user agreed were unwanted.

I also removed a diagnostics tool called PC-Doctor, and an HP utility called Advisor Dock which stuck itself prominently on the desktop. These tools might in some circumstances be useful, though I’m wary. I have no idea why HP decided to supply its own dock accessory after Microsoft’s efforts with the Windows 7 taskbar.

We left in place an application called HP Games which is a branded version of WildTangent ORB and includes some free games.

The short answer is that the Windows ecosystem has not changed. The deal is that your cheap PC is subsidised by the trialware that comes with it. Another issue is OEM utilities – like HP’s Advisor Dock – which jar with the careful design Microsoft put into Windows 7 and duplicate functionality that is already built in.

In mitigation, Windows 7 runs so well on current hardware that even this budget PC offers snappy performance, and I had no difficulty removing the unwanted add-ons. The speed of setup – and the number of restarts required – was much better than I recall from the last Toshiba laptop I set up.

Nevertheless, on the basis of this example there is still work to do if the experience of starting with a Windows PC is to come close to that offered by the Mac. Further, bundling anti-malware software that requires a subscription is actually a security risk, since a proportion of users will not renew and therefore end up without updates. I would be interested in other reports.


Google’s new language: Go

Google has a new language. The language is called Go, though issue 9 on the bug tracker is from the inventor of another language called Go and asks for a name change. Co-inventor Rob Pike says [PDF] that Google’s Go is a response to the problem of long build times and uncontrolled dependencies; fast compilation is an important feature. It is a garbage-collected language with C-like syntax – echoes of Java and C# there – and has strong support for concurrency and communication. Pike’s examples in the paper referenced above do show a simple and effective approach to communication, with communication channel objects, and to concurrency, with Goroutines.
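
I have not gone beyond trivial programs myself, but based on Pike’s description, a minimal concurrency sketch in present-day Go syntax might look like the following; the names (worker, jobs, results) are my own, purely for illustration:

    package main

    import "fmt"

    // worker reads numbers from jobs and sends their squares to results.
    // Channels are typed, and the arrow syntax restricts direction.
    func worker(jobs <-chan int, results chan<- int) {
        for n := range jobs { // the loop ends when jobs is closed
            results <- n * n
        }
        close(results)
    }

    func main() {
        // Buffered channels, so the sends below do not block.
        jobs := make(chan int, 5)
        results := make(chan int, 5)

        // A goroutine: a cheap, concurrently scheduled function call.
        go worker(jobs, results)

        for i := 1; i <= 5; i++ {
            jobs <- i
        }
        close(jobs)

        for r := range results {
            fmt.Println(r) // prints 1 4 9 16 25, one per line
        }
    }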

Go runs only on Linux or Mac OS X. I installed it on Ubuntu and successfully compiled and ran a one-line application. I used the 32-bit version, though apparently the 64-bit implementation is the most advanced.
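
For the record, here is a complete minimal Go source file – not necessarily the exact code I ran, but of the same order of triviality. As I understand it, the 32-bit x86 compiler and linker in this release are called 8g and 8l (6g and 6l for the 64-bit version):

    // hello.go – the body of main is the one line that matters.
    // Build and run, assuming the 32-bit toolchain:
    //   8g hello.go && 8l hello.8 && ./8.out
    package main

    import "fmt"

    func main() {
        fmt.Println("hello from Go")
    }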

Pike claims that performance is “typically within 10%-20% of C”. No debugger yet, though one is in preparation; no generics yet, though they are planned for the long term. Pointers, but no pointer arithmetic.

Go does not support type inheritance, but “Rather than requiring the programmer to declare ahead of time that two types are related, in Go a type automatically satisfies any interface that specifies a subset of its methods.”
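
A hypothetical example (the names are mine, not from the Go documentation) makes this concrete: the Server type below satisfies Describer without ever mentioning it.

    package main

    import "fmt"

    // Describer is an interface; no type ever declares that it implements it.
    type Describer interface {
        Describe() string
    }

    type Server struct {
        Name string
    }

    // Simply by having this method, Server satisfies Describer – there is
    // no "implements" clause and no declared relationship between the types.
    func (s Server) Describe() string {
        return "server: " + s.Name
    }

    func main() {
        var d Describer = Server{Name: "hyper01"}
        fmt.Println(d.Describe())
    }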

Google has many projects, and while Go looks significant, it is dangerous to make assumptions about its future importance.

I don’t think Google is doing this just to prove that it can; I think it is trying to solve some problems and doing so in an interesting way.


Surveys are useless

I’m at Microsoft Tech-Ed in Berlin where 7000-odd IT admins and developers (though more admins) are looking at Microsoft technology.

I was browsing round the stands in the Technical Learning Centre here when I came to one where Microsoft’s technical documentation team was handing out a survey. Fill in the survey, get a plastic rocket. I looked through the survey, in which you had to rate innumerable aspects of the documentation on Microsoft’s technical resource sites (MSDN, TechNet etc).

I refused to complete it, on the grounds that it would not yield anything of value. I can put numbers in boxes as well as anyone else, but the numbers tend to be arbitrary, and all too often the real answers cannot be condensed into a 1-to-5 rating. I said that the way to find out what people think of the documentation is to ask them, not to have them putting numbers on a form.

Inevitably, the guys asked me that question, and we had a discussion of the issues I’ve found with the sites, including:

  • Broken links. I don’t think Microsoft should ever delete a knowledgebase entry. Mark them obsolete or even wrong, but don’t remove them.
  • Too many locations with overlapping content – MSDN, Technet, specialist sites, team blogs etc.
  • Documentation that states the obvious – eg how to enable or disable a feature – but neglects to mention the interesting stuff like why you would want to enable or disable it and what the implications are.
  • Documentation that is excessively verbose or makes you drill down into link after link before finding the real content.
  • Documentation that is not clearly dated, so that you might be reading obsolete advice.

Anyway, I felt I had a worthwhile discussion and was listened to; whereas completing the survey would not have brought out these points effectively.

Love and hate for Microsoft Small Business Server

I’ve just completed a migration from Small Business Server 2003 to 2008. I’ve worked on and off with SBS since version 4.0, and have mixed feelings about the product. It has always been great value, but massive complexity lurks not far beneath its simple wizards.

The difficulty of migration is probably its worst feature: the server chugs along for a few years, gradually outgrowing its hardware, and when the time comes to replace it customers face a choice between starting from scratch with a clean install – setting up new accounts, importing mailboxes, removing every client machine from the old domain and rejoining it to the new one – or else a painful migration.

I took the latter route, and also decided to go virtual on Hyper-V Server 2008 R2. In most important respects it went smoothly: Active Directory behaved itself, and the Exchange mailboxes all came over cleanly.

Still, several things struck me during the migration. Microsoft has a handy 79-page step-by-step document, but anyone who thinks that carefully following the steps will guarantee success will be disappointed. There are always surprises: the document does not properly cover DHCP, for example. The migration is surprisingly messy in places. The new SBS has different sets of permissions from the old one, and after the upgrade you have to somehow merge the two. The process is not fully automated, and there is plenty of manual editing of settings along the way.

Even migrating SBS 2008 to SBS 2008, for a new server, has brought forth a 58-page document from Microsoft.

Then there are the errors to deal with. There are always errors; you have to figure out which ones are significant and how to fix them. I would like to meet a Windows admin who could look me in the eye and say they have no errors in their event log.

Things got bad when applying all the updates needed to bring the server up to date. At one point SharePoint broke completely and could not contact its configuration database. There’s also the mystery of security update KB967723, which Windows Update installed insisting that it was “important”, and which then generated the following logged message 79 times in the space of a few seconds:

Windows Servicing identified that package KB967723(Security Update) is not applicable for this system

Nevertheless, a little tender care and attention got the system into reasonable shape. It is even smart enough to change Outlook settings to the new server automatically. A great feature of the migration is that email flow is never interrupted.

One problem: although running SBS virtualised is a supported configuration, the built-in backup system doesn’t handle it well, because it assumes the use of external USB drives, which Hyper-V guests cannot access directly. There are many workarounds, none perfect; it appears that Microsoft did not think this one through.

That said, the virtual solution has some inherent advantages for backup and restore, the main one being that you can guarantee identical hardware for disaster recovery. If you shut the guests down and back up the host, or export the VM, you have a reliable system backup. You can also back up a running guest from the host, though in my experience this is more fragile.

Migrating an SBS system is actually harder than working with grown-up Windows systems on separate servers (or virtual servers) because it all has to be done together. I reckon Microsoft could do a better job with the tools; but it is a complex process with multiple potential points of failure.

The experience overall does nothing to shake my view that cloud-based services are the future. I would like to see SBS become a kind of smart cache for cloud storage and services, rather than being a local all-or-nothing box that can absorb large amounts of troubleshooting time. Microsoft is going to lose a lot of this SME business, because it has ploughed on with more of the same rather than helping its existing SBS customers to move on.

Nevertheless, if you have made the decision to run your own email and collaboration services, rather than being at the mercy of a hosted service, SBS 2008 does it all.

Migrating to Hyper-V 2008 R2

I have a test setup in my office which runs mostly on Hyper-V. It is a kind of home-brew small business server, with Exchange, ISA and SharePoint all running on separate VMs. I’ve followed Microsoft’s advice and kept Active Directory on a separate physical server. Until today, Hyper-V itself was running on Server 2008.

I’m reviewing Hyper-V Server 2008 R2, so I figured it would be interesting to migrate the VMs. I attached an external USB drive, shut down the VMs and exported them. Next, I verified that there was nothing else I needed to preserve on that machine, and set about installing Hyper-V Server 2008 R2 from scratch.

Aside: when I first set this up I broke the rules by having Active Directory on the Hyper-V host. That worked well enough in my small setup; but I realised that you lose some of the benefit of virtualisation if you have anything of value on the host, so I moved Active Directory to a separate box.

I wish I could tell you that the migration went smoothly. From the Hyper-V perspective, it actually did. However, I had an ordeal with my server, a cheapie HP ML110 G5. The driver for the embedded Adaptec SATA RAID did not work with Hyper-V Server 2008 R2, and I couldn’t find an update, so I disabled the RAID. The driver for my second network card also didn’t work, and I had to replace the card. Finally, my efforts at updating the BIOS had landed me with a known problem on this server: the fans running at maximum speed and deafening volume. Fortunately I found this thread which gives a fix: installing upgraded firmware for HP’s Lights-Out Remote Management as well. Blissful (near) silence.

Once I’d got the operating system installed successfully, bringing the VMs back online was a snap. I used the console menu to join the machine to the domain, set up remote management, and configure the network cards. Next, I copied the exported VMs to the new server and imported them using Hyper-V Manager running on Windows 7, and shortly afterwards everything was up and running again. I did get a warning logged about the integration services being out of date, but these were easy to upgrade. I’m hoping to see some performance benefit, since my .vhd virtual drives are dynamic, and these are meant to be much faster in the R2 update.

Although I’m impressed with Hyper-V itself, some aspects of Hyper-V Server 2008 R2 are lacking. Mostly this is to do with Server Core. Shipping a cut-down Server OS without a GUI is a great idea in itself, but Microsoft either needs to make it easy to manage from the command line, or easy to hook up to remote tools. Neither is the case. If you want to manage Hyper-V from the command line you need this semi-official management library, which seems to be the personal project of technical evangelist James O’Neill. Great work, but you would have thought it would be built into the product.

As for remote tools, the tools themselves exist, but getting the permissions right is such an arcane process that another dedicated Microsoft individual, program manager John Howard, wrote a script to make it possible for humans. It is not so bad with domain-joined hosts like mine, but even then I’ve had strange errors. I haven’t managed to get Device Manager working remotely yet – “Access denied” – and sometimes I get a Kerberos error, “network path not found”.

Fortunately there’s only occasional need to access the host once it is up and running; it seems very stable and I doubt it will require much attention.

Sophos Windows 7 anti-virus test tells us nothing we don’t already know

Sophos is getting good publicity for its latest sales pitch, a virus test on Windows 7. This tells us:

We grabbed the next 10 unique samples that arrived in the SophosLabs feed to see how well the newer, more secure version of Windows and UAC held up. Unfortunately, despite Microsoft’s claims, Windows 7 disappointed just like earlier versions of Windows. The good news is that, of the freshest 10 samples that arrived, 2 would not operate correctly under Windows 7.

Unfortunately Chester Wisniewski from Sophos is vague about his methodology, though he does say that Windows 7 was set up in its default state and without anti-virus installed. The UAC setting was on its new default, which is less secure (and intrusive) than the default in Windows Vista.

My presumption is that he copied each virus to the machine and executed it – and was apparently disappointed (or more likely elated) to discover that 8 out of 10 examples infected the machine.

It might be more accurate to say that he infected the machine himself, by copying each virus to it and executing it.

I am not sure what operating system would pass this test. What about a script, for example, that deleted all a user’s documents? UAC would not attempt to prevent that; users have the right to delete their own documents if they wish. Would that count as a failure?
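
To make the point concrete, here is a deliberately benign sketch (in Go; the path and file name are my own, for illustration only) showing that a process running with plain user rights can create and delete files in the user’s profile without triggering any UAC prompt – UAC guards elevation to administrative rights, not the user’s own data:

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        home, err := os.UserHomeDir()
        if err != nil {
            panic(err)
        }
        // A hypothetical file in the user's own Documents folder.
        target := filepath.Join(home, "Documents", "uac-demo.txt")

        // Create it...
        if err := os.WriteFile(target, []byte("expendable"), 0o644); err != nil {
            panic(err)
        }
        // ...and delete it again. No elevation, no prompt, no warning:
        // these are the user's own files, so UAC has nothing to say about it.
        if err := os.Remove(target); err != nil {
            panic(err)
        }
        fmt.Println("created and deleted", target, "with standard user rights")
    }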

Now, it may be that Wisniewski means that these executables successfully escalated their permissions. This means, for example, that they might have written to system locations which are meant to be protected unless the user passes the UAC prompt. That would count as some sort of failure – although Microsoft has never claimed that UAC will prevent it, particularly if the user is logged on with administrative rights.

If this were a serious study, we would be told what the results were when the user was logged on with standard user rights (Microsoft’s long-term goal), and when UAC was wound up to its highest level (which I recommend).

Even in that case, it would not surprise me if some of the malware succeeded in escalating its permissions and infecting system areas, though it would make a more interesting study. The better way to protect your machine is not to execute the malware in the first place. Unfortunately, social engineering means that even skilled users make mistakes; or sometimes a bug in the web browser enables a malicious web site to install malware (that would also be a more interesting study). Sometimes a user will even agree to elevate the malware’s rights – UAC cannot prevent that.

My point: the malware problem is too important to trivialise with this sort of headline-grabbing, meaningless test.

Nor do I believe the implicit message in Wisniewski’s post, that buying and installing Sophos will make a machine secure. Anti-virus software has by and large failed to protect us, though undoubtedly it will prevent some infections.

See also this earlier post about UAC and Windows security, which has links to some Microsoft statements about it.


The cloud in education: Google Apps vs Live@Edu

I’ve been researching the use of cloud apps in education for a talk I am giving next week. I’m normally more business-focused, and it’s been interesting to uncover another area where Microsoft and Google are in hot competition. Both companies are happy to give educational institutions free cloud email and collaboration services; and the offer is being snapped up by colleges and universities hard-pressed for money and tired of fighting spam-clogged inboxes. 

Microsoft has first-mover advantage here: Live@Edu has been around since March 2005 as a service based on Hotmail, though its evolution into a fuller collaboration system is more recent, whereas Google Apps for Education did not appear until October 2006. Both are generous schemes – of course the providers want to get students hooked on their stuff – and as far as I can tell both are well liked.

What is interesting is to look at the points of differentiation, which show the contrasting approaches of these two companies. Microsoft is pursuing its “software plus services” strategy, which means desktop applications still play an important role. The email is Exchange-based, so you can use other email clients, but only Outlook on Windows will deliver full features. Document collaboration is based primarily on cloud storage rather than online editing, though when Office Web Apps appear next year users will have some lightweight editing tools.

Google, on the other hand, is primarily web-based, with desktop support as an add-on. Google has the lead in online document editing, since it has had Google Docs for some time, whereas Office Web Apps are still in beta. Google has no bias towards Windows and Office. With Google, a document’s primary existence is in the cloud, although you can export and import, with possible loss of data or formatting.

Something else I noticed is that Google has big plans for integration with mobile devices, whereas Microsoft seems mainly concerned with Exchange synchronisation.

Microsoft’s pitch is that if you live in Windows anyway, with Exchange and SharePoint on the server, and Windows and Office on the client, then its cloud service integrates nicely. Google on the other hand is more revolutionary, not caring about what you run as long as you can connect to its services.

Although the software-plus-services idea has attractions, it sounds more like a transitional strategy than one for the long term. Over time, as the web platform gets more powerful and rich internet applications take over from pure desktop applications, the services part will become dominant.

Google is a cooler brand than Microsoft, which helps its case when students are asked which platform they prefer.

Has anyone tried both platforms? Or even just one of them? I’d be interested in hearing your comments.

Ubuntu Linux: the agony and the ecstasy

Just after writing a positive review of Ubuntu Karmic Koala I noticed this piece on The Register: Early adopters bloodied by Ubuntu’s Karmic Koala:

Blank and flickering screens, failure to recognize hard drives, defaulting to the old 2.6.28 Linux kernel, and failure to get encryption running are taking their toll, as early adopters turn to the web for answers and log fresh bug reports in Ubuntu forums.

Did I get it wrong? Should I be warning users away from an operating system and upgrade that will only bring them grief?

I doubt it, though I see both sides of this story. I’ve been there: hours spent trying to get Bluetooth working on the Toshiba laptop on which I’m typing; or persuading an Asus Eee PC to connect to my wi-fi; or running dpkg-reconfigure xserver-xorg to try to get Compiz working or to escape basic VGA; or running Super Grub to fix an Ubuntu PC that will not boot; or trying to fix a failed migration from Lilo to Grub 2 on my Ubuntu server.

That said, I noticed that the same laptop which gave me Ubuntu Bluetooth grief a couple of years ago now works fine with a clean install, Bluetooth included. It’s even possible that my own contribution helped – that’s how Linux works – though I doubt it in this case.

I also noticed how Ubuntu 9.10 has moved ahead of Windows in several areas. Here are three:

  1. Cloud storage and synchronization

    Microsoft has Live Mesh. Typical Microsoft: some great ideas, I suspect over-engineered, requiring a complex runtime to be downloaded and installed, with no clear place in Microsoft’s overall strategy, and still in beta long after it was first trumpeted as a big new thing. So is it built into Windows 7? No way.

    By contrast, Ubuntu turns up with Ubuntu One, what looks like a dead simple cloud storage and synchronization piece: web access, file system access, optional sharing, and file sync across multiple computers. I’ve not checked how it handles conflicts, but then Mesh was pretty poor at that too, last time I looked. And it is all built into Karmic Koala: click, register, done.

  2. Multiple workspaces

    Apple and Linux have had this for years; I have no idea why it isn’t in Windows 7, or Vista for that matter. Incredibly useful – if the screen is busy but you don’t fancy closing all those windows, just switch to a new desktop.

  3. Application install

    This is so much better on Linux than on Windows or Mac; the only platform I know of that is equally user-friendly is the iPhone. OK, the iPhone is better, because it has user ratings and so on; but Ubuntu is pretty good: Software Centre – browse – install.

I could go on. Shift-Alt-UpArrow, Ubuntu’s version of Exposé, is very nice, and not on Windows. And there’s the fact that I can connect a file explorer over SSH using Places – Connect to Server, where on Windows I have to download and install WinSCP or the like.

Plus, let’s not forget that Ubuntu is free.

Of course you can make a case for Windows too. It’s more polished, it’s ubiquitous, and its application availability is beyond compare. It is a safe choice. I’m typing this on Ubuntu in BloGTK, but missing Windows Live Writer.

Still, Ubuntu is a fantastic deal, especially with Ubuntu One included. I don’t understand the economics by which Canonical can give everyone in the world 2GB of free cloud storage; if it is hoping that enough people will upgrade to the 50GB paid-for version that it will pay for the freeloaders, I fear it will be disappointed.

My point: overall, there is far more right than wrong with Ubuntu in general and Karmic Koala in particular; and I am still happy to recommend it.

Hyper-V Server 2008 R2: a great deal for Windows virtualization

Microsoft’s free Hyper-V Server 2008 R2 is a version of Windows Server Core dedicated to one function only: hosting virtual machines. Can you really get something worthwhile for nothing from Microsoft? The answer seems to be yes, especially when it is trying to win market share from well-established competitors. I’ve had test servers running on the earlier release of Hyper-V since Server 2008 first appeared, and it’s worked well.

Hyper-V R2 has a number of interesting new features, including live migration. Another, less exciting but of great interest to folk like me who are constantly running trial software, is that dynamically expanding virtual hard drives now perform nearly as well as fixed-size ones. Dynamic drives are far more convenient.

I downloaded Hyper-V Server 2008 R2 and installed it on a spare machine. The main requirements are a processor that supports hardware virtualization (Intel VT or AMD-V) and hardware Data Execution Prevention (Intel’s Execute Disable bit or AMD’s NX bit); note that both also have to be enabled in the BIOS.

Once it is up and running you are greeted with a couple of text windows, which feels sparse compared to the usual Windows GUI, but which does provide a convenient menu of the things you are likely to want to do next. Actions include naming the computer, joining a domain, downloading updates, adding a local administrator and configuring remote desktop.

Working with Server Core does have some hassles. For example, many third-party drivers and tools come as setup executables that will not run without a GUI. The major vendors should have come to terms with this by now, but it can be a problem particularly with older hardware.

The next step (if you are on Windows 7) is to download and install the Remote Server Administration Tools for Windows 7. Note that after installing, you have to go into Control Panel – Programs – Windows Features and enable the tools you need, including at least the Hyper-V manager. Then you can run this from the Start menu and connect to your new server.

This step can be problematic. My first attempts failed with RPC permission errors, which I solved by joining the Hyper-V server to the Windows domain. If that is not available or desired, there are other fixes.

Other remote admin tools can be useful too. For example, you can connect the Event Viewer to check out the logs.

Once Hyper-V manager is connected, you can create a new virtual machine with a few clicks. I downloaded the latest Ubuntu server iso, copied it to the Hyper-V server, and set it as the DVD drive for the new machine. Started it up, connected, and I was ready to go.

Hyper-V Server is not the only no-cost virtualization platform; fully free and open source options also exist – like, indeed, Ubuntu with KVM. I’d also note that VMware is a more mature and advanced platform, despite Hyper-V’s rapid progress.

Still, what you get with Hyper-V Server is a polished and easy-to-use solution that integrates easily with Windows and Active Directory. This is a great deal.
