Microsoft Hyper-V vs VMware: is System Center the weak point?

The Register reports that Google now runs all its cloud apps in Docker-like containers; this is in line with what I heard at the QCon developer event earlier this year, where Docker was the hot topic. What caught my eye though was Trevor Pott’s comment comparing, not Hyper-V to VMware, but System Center Virtual Machine Manager to VMware’s management tools:

With VMware, I can go from "nothing at all" to "fully managed cluster with everything needed for a five nines private cloud setup" in well under an hour. With SCVMM it will take me over a week to get all the bugs knocked out, because even after you get the basics set up, there are an infinite number of stupid little nerd knobs and settings that need to be twiddled to make the goddamned thing actually usable.

VMware guy struggling to learn a different way of doing things? There might be a little of that; but Pott makes a fair point (in another comment) about the difficulty, with Hyper-V, of isolating the hypervisor platform from the virtual machines it is hosting. For example, if your Hyper-V hosts are domain-joined, and your Active Directory (AD) servers are virtualised, and something goes wrong with AD, then you could have difficulty logging in to fix it. Pott is talking about a 15,000 node datacenter, but I have dealt with this problem at a micro level; setting up Windows to manage a non-domain-joined host from a domain-joined client is challenging, even with the help of the scripts written by an enterprising Program Manager at Microsoft. Of course your enterprise AD setup should be so resilient that this cannot happen, but it is an awkward dependency.

Writing about enterprise computing is a challenge for journalists because of the difficulty of getting hands-on experience or objective insight from practitioners; vendors of course are only too willing to show off their stuff but inevitably they paint with a broad brush and with obvious self-interest. Much of IT is about the nitty-gritty. I do a little work with small businesses partly to get some kind of real-world perspective. Even the little I do is educational.

For example, recently I renewed the certificate used by a Microsoft Dynamics CRM installation. Renewing and installing the certificate was easy; but I neglected to set permissions on the private key so that the CRM service could access it, and so it did not work. There was a similar step needed on the ADFS server (because this is an internet-facing deployment); it is not an intuitive process, because the errors which surface in the event viewer often do not pinpoint the actual problem, but rather are a symptom of it. It does not help that the CRM Email Router, when things go wrong, logs an identical error event every few seconds, drowning out any other events.

In other words, I have shared some of the pain of sysadmins and know what Pott means by “stupid little nerd knobs”.

Getting back to the point, I have actually installed System Center including Virtual Machine Manager in my own lab, and it was challenging. System Center is actually a suite of products developed at different times and sometimes originating from different companies (Orchestrator, for example), and this shows in a lack of consistency in the user interface, and in occasional confusing overlap in functionality.

I have a high regard for Hyper-V itself, having found it a solid and fast performer in my own use and an enormous advance over working with physical servers. The free management tool that you can install on Windows 7 or 8 is also rather good. The free Hyper-V server you can download from Microsoft is one of the best bargains in IT. Feature-wise, Hyper-V has improved rapidly with each new release and it seems to me a strong offering.

We have also seen from Microsoft’s own Azure cloud platform, which uses Hyper-V for virtualisation, that it is possible to automate provisioning and running Hyper-V at huge scale, controlled by easy-to-use management tools, either browser-based or using PowerShell scripts.

Talk private cloud though, and you are back with System Center with all its challenges and complexity.

Well, now you have the option of Azure Pack, which brings some of Azure’s technology (including its user-friendly portal) to enterprise or hosting provider datacenters. Microsoft needed to harmonise System Center with Azure; and the fact that it is replacing parts of System Center with what has been developed for Azure suggests recognition that the Azure technology is much better; though no doubt installing and configuring Azure Pack has challenges of its own.

My last reflection on the above is that ease of use matters in enterprise IT just as it does in the consumer world. Yes, the users are specialists and willing to accept a certain amount of complexity; but if you have reliable tools with clearly documented steps that help you to do things right, then there are fewer errors and greater productivity.

Xamarin 3.0 brings iOS visual design to Visual Studio, cross-platform XAML, F#, NuGet and more

Xamarin has announced the third version of its cross-platform tools, which use C# and .NET to target multiple platforms, including iOS, Android and Mac OS X.

Xamarin 3.0 is a big release. In summary:

Xamarin Designer for iOS

Using a visual designer for iOS Storyboard projects, you can create and modify a GUI in both Visual Studio and Xamarin Studio (Xamarin’s own IDE). The designer uses the native Storyboard format, so you can open and modify existing files created in Xcode on the Mac. The technology here is amazing, since the iOS controls are rendered remotely on a Mac and transmitted to the designer on Windows. See here for a quick hands-on.

Xamarin Forms

Xamarin has created the cross-platform GUI framework that it said it did not believe in. It is based on XAML though not compatible with Microsoft’s existing XAML implementations. There is no visual designer yet.

Why has Xamarin changed its mind? Pressure from enterprise customers, from what I heard from CEO Nat Friedman. They want to build internal mobile apps with many forms, and do not want to rewrite the GUI code for every mobile platform they support.

Friedman made the point that Xamarin Forms still renders native controls. There is no drawing code in Xamarin Forms.

“The challenge for us in building Xamarin Forms was to give people enhanced productivity without compromising the native approach. The mix and match approach, where you can mix in native code at any point, you can get a handle for the native control; we think we’ve got the right compromise. And we’re not forcing Xamarin Forms on you, this is just an option,”

he told me.

Again, there is a quick hands-on here.
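
To give a flavour of the approach, here is a minimal sketch (my own, not Xamarin’s sample) of a page defined once in shared C# code; the App/GetMainPage pattern follows the preview project templates, so treat the details as illustrative rather than definitive:

using Xamarin.Forms;

// Illustrative sketch only: one page defined in shared code, rendered
// with native controls on iOS, Android and Windows Phone.
public static class App
{
    public static Page GetMainPage()
    {
        var label = new Label { Text = "Hello from shared code" };
        var button = new Button { Text = "Tap me" };
        button.Clicked += (sender, args) => label.Text = "Tapped!";

        return new ContentPage
        {
            Content = new StackLayout
            {
                Padding = 40,
                Children = { label, button }
            }
        };
    }
}

Each platform project then calls this shared method to obtain its start page, and the Label and Button come out as native controls on each device.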

F# support

F# is now officially supported in Xamarin projects. This brings functional programming to Xamarin, and will be warmly welcomed by the small but enthusiastic F# community (including, as I understand it, key .NET users in the financial world).

Portable Class Libraries

Xamarin now supports Microsoft’s Portable Class Libraries, which let you state what targets you want to support, and have Visual Studio ensure that you write compatible code. This also means that library vendors can easily support Xamarin if they choose to do so.
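
As a hypothetical illustration of what this enables, a class like the following uses only portable APIs, so it can be compiled once into a PCL and consumed unchanged from iOS, Android and Windows projects (on older profiles HttpClient comes via the Microsoft HTTP Client Libraries NuGet package):

using System.Net.Http;
using System.Threading.Tasks;

// Hypothetical shared service: it uses only portable APIs, so Visual Studio
// will verify that it compiles for every target the PCL profile includes.
public class DataService
{
    public async Task<string> FetchAsync(string url)
    {
        using (var client = new HttpClient())
        {
            return await client.GetStringAsync(url);
        }
    }
}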

NuGet Packages

The NuGet package manager has transformed the business of getting hold of new libraries for use in Visual Studio. Now you can use it with Xamarin in both Visual Studio and Xamarin Studio.

Microsoft partnership

Perhaps the most interesting part of my interview with Nat Friedman was what he said about the company’s partnership with Microsoft. Apparently this is now close both technically and commercially, with Microsoft inviting Xamarin for briefings with key customers.

Hands on with Xamarin 3.0: a cross-platform breakthrough for Visual Studio

Today Xamarin announced version 3.0 of its cross-platform mobile development tools, which let you target Android and iOS with C# and .NET. I have been trying a late beta preview.

In order to use Xamarin 3.0 with iOS support you do need a Mac. However, you can do essentially all of your development in Visual Studio, and just use the Mac for debugging.

To get started, I installed Xamarin 3.0 on both Windows (with Visual Studio 2013 installed) and on a Mac Mini on the same network.

image

Unfortunately I was not able to sit back and relax. I got an error installing Xamarin Studio, following which the installer would not proceed further. My solution was to download the full DMG (Mac virtual disk image) for Xamarin Studio and run that separately. This worked, and I was able to complete the install with the combined installer.

When you start a Visual Studio iOS project, you are prompted to pair with a Mac. To do this, you run a utility on the Mac called Xamarin.iOS Build Host, which generates a PIN. You enter the PIN in Visual Studio and then pairing is active.

image

Once paired, you can create or open iOS Storyboard projects in Visual Studio, and use Xamarin’s amazing visual designer.

image

Please click this image to open it full-size. What you are seeing is a native iOS Storyboard file open in Visual Studio 2013 and rendering the iOS controls. On the left is a palette of visual components I can add to the Storyboard. On the right is the normal Visual Studio solution explorer and property inspector.

The way this works, according to what Xamarin CEO Nat Friedman told me, is that the controls are rendered using the iOS simulator on the Mac, and then transmitted to the Windows designer. Thus, what you see is exactly what the simulator will render at runtime. Friedman says it is better than the Xcode designer.

“The way we do event handling is far more intuitive than Xcode. It supports the new iOS 7 auto-layout feature. It allows you to live preview custom controls. Instead of getting a grey rectangle you can see it live rendered inside the canvas. We use the iOS native format for Storyboard files so you can open existing Storyboard files and edit them.”

I made a trivial change to the project, configured it to debug on the iOS simulator, and hit Start. On the Mac side, the app opened in the simulator. On the Windows side, I had breakpoint debugging.

image

Now, I will not pretend that everything ran smoothly in the short time I have had the preview. I have had problems with the pairing after switching projects in Visual Studio. I also had to quit and restart the iOS Simulator in order to get rendering working again. It is an amazing experience though, combining remote debugging with a visual designer in Visual Studio on Windows that remote-renders design-time controls.

Still, time to look at another key new feature in Xamarin 3: Xamarin Forms. This is none other than our old friend XAML, implemented for iOS and Android. The Mono team has some experience implementing XAML on Linux, thanks to the Moonlight project which did Silverlight on Linux, but this is rather different. Xamarin Forms does not do any custom drawing, but wraps native controls. In other words, it is like the Eclipse SWT approach for Java, and not like the Swing approach which does its own drawing. This is in keeping with Xamarin’s philosophy of keeping apps as native as possible, even though the very existence of a cross-platform GUI framework is something of a compromise.

I have not had long to play with this. I did create a new Xamarin Forms project, and copy a few lines of XAML from a sample into a shared XAML file. Note that Xamarin Forms uses Shared Projects in Visual Studio, the same approach used by Microsoft’s Universal Apps. However, Xamarin Forms apps are NOT Universal Apps, since they do not support Windows 8 (yet).

image 

In a Shared Project, you have some code that is shared, and other code that is target-specific. By default hardly any code is shared, but you can move code to the shared node, or create new items there. I created XamFormsExample.xaml in the shared node, and amended App.cs so that it loads automatically. Then I ran the project in the Android emulator.

image

I was also able to run this on iOS using the remote connection.
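
For reference, the amendment to App.cs amounts to returning the XAML-defined page. Assuming XamFormsExample is the class named in the x:Class attribute of the XAML file (and so derives from Page), it looks something like this:

using Xamarin.Forms;

public static class App
{
    public static Page GetMainPage()
    {
        // Return the page defined in XamFormsExample.xaml, whose class name
        // comes from the x:Class attribute in the XAML file.
        return new XamFormsExample();
    }
}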

I noticed a few things about the XAML. The namespaces are:

xmlns="http://xamarin.com/schemas/2014/forms"
xmlns:x="http://schemas.microsoft.com/winfx/2009/xaml"

I have not seen this before. Microsoft’s XAML always seems to have a “2006” namespace. For example, this is for a Universal App:

xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"

However, XAML 2009 does exist and apparently can be used in limited circumstances:

In WPF, you can use XAML 2009 features, but only for XAML that is not WPF markup-compiled. Markup-compiled XAML and the BAML form of XAML do not currently support the XAML 2009 language keywords and features.

It’s odd, because of course Xamarin’s XAML is cut-down compared to Microsoft’s XAML. That said, I am not sure of the exact specification of XAML in Xamarin Forms. I have a draft reference but it is incomplete. I am not sure that styles are supported, which would be a major omission. However, you do get layout managers including AbsoluteLayout, Grid, RelativeLayout and StackLayout. You also get controls (called Views) including Button, DatePicker, Editor, Entry (single line editor), Image, Label, ListView, OpenGLView, ProgressBar, SearchBar, Slider, TableView and WebView.
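
Since there is no designer, you write the XAML by hand or compose pages in code. Here is a purely hypothetical sketch using a few of the views listed above; the names are taken from the draft reference, so details may change:

using Xamarin.Forms;

// Hypothetical page composing some of the listed views in a StackLayout.
public class ExamplePage : ContentPage
{
    public ExamplePage()
    {
        Content = new StackLayout
        {
            Padding = 20,
            Children =
            {
                new Label { Text = "Example form" },
                new Entry { Placeholder = "Name" }, // single-line editor
                new DatePicker(),
                new Slider(),
                new Button { Text = "Submit" }
            }
        };
    }
}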

Xamarin is not making any claims for compatibility in its XAML implementation. There is no visual designer, and you cannot port existing XAML code. The commitment to wrapping native controls may limit prospects for compatibility. However, Friedman did say that Xamarin hopes to support Universal Apps, i.e. to run on Windows 8 as well as Windows Phone, iOS and Android. He said:

I think it is the right strategy, and if it does take off, which I think it will, we will support it.

Friedman says the partnership with Microsoft (which began in November 2013) is now close, and it would be reasonable to assume that greater compatibility with Microsoft XAML is a future goal. Note that Xamarin 3 also supports Portable Class Libraries, so on the non-visual side sharing code with Microsoft projects should be straightforward.

Personally I think both Xamarin Forms and the iOS visual designer (which, note, does NOT support Xamarin Forms) are significant features. The iOS designer matters because you can now do almost all of your cross-platform mobile development within Visual Studio, even if you want to follow the old Xamarin model of a different, native user interface for each platform; and Xamarin Forms because it enables a new level of code sharing for Xamarin projects, as well as making XAML into a GUI language that you can use across all the most popular platforms. Note that I do have reservations about XAML; but it does tick the boxes for scaling to multiple form factors and for enormous flexibility.

Event report: Sony demonstrates the high-res audio HAP-Z1ES player at the Audio Lounge in London

I went along to the Audio Lounge in London to hear Sony’s Eric Kingdon (Senior European Technical Marketing Manager) and Mike Somerset (Product Marketing Manager) talk about high resolution audio and demonstrate the HAP-Z1ES player.

image

The HAP-Z1ES costs £1,999 and plays both DSD (the format of SACD) and PCM formats, including DSDIFF, DSF, WAV, FLAC, ALAC, MP3 and ATRAC. PCM is supported up to 24-bit/192kHz and DSD up to double DSD (DSD128, a 1-bit stream at twice the SACD rate). It was demonstrated with the Sony TA-A1E amplifier (also £1,999) and the Crystal Cable Arabesque Mini loudspeakers, which cost €12,999 (not sure of the £ price) including the stands.

image

This was a small event for customers and there were around 20 attending. Ruth Phypers at the Audio Lounge gave us a warm welcome and conveyed nothing other than enthusiasm for audio; no high-pressure sales here. The talk and demonstration took place in the basement listening room.

image

High resolution audio is controversial, in that there is evidence that even CD quality (16-bit/44.1 kHz) is good enough to capture everything we can hear in normal music played at normal levels – 16 bits gives around 96dB of dynamic range, and 44.1kHz sampling captures frequencies up to 22.05kHz, beyond the limit of human hearing; see Monty Montgomery’s excellent technical explanation and accompanying videos for why – and I was interested to see how Sony is pitching high-res to its potential customers. I was also interested to see if it would broach the tricky subject of DSD vs PCM and whether there is any audible difference.

In this respect it was a curious event, as you will see. One of the odd things was that little music was played: maybe 10 minutes out of a one-and-a-half-hour presentation.

Somerset kicked things off, explaining the battle between convenience and quality in music reproduction. “We’ve lost a lot in quality,” he said, thanks to the popularity of MP3. So what does Sony mean by high-res? Anything beyond CD quality, he said, confusing the issue: is it MP3 that is limiting audio quality today, or CD?

“A lot of people out there think CD, that’s as good as it gets, nothing better, obviously we know that’s not true,” he said.

That said, he made the point that the Z1ES is not just designed for high-res, but to perform well with most formats and resolutions. It has a DSEE (Digital Sound Enhancement Engine) which supposedly improves the sound of lossy-compressed audio by “improving the spectrum” (according to the slide; I still have no idea what this means); and a DSD remastering engine that converts lossless PCM to double DSD on the fly (the PCM file remains as-is and is not stored twice).

Why would you want to do that? I asked Kingdon later, who said it was a matter of personal taste; you should take it home and try it. Personally I’m not sure why it should make any difference at all to the sound; you would have thought it would be audibly transparent if the double DSD encoding is doing its job, and if it does sound different it raises the question of whether the DSD conversion ends up colouring the sound; unless perhaps the DAC is more capable with DSD than with PCM. On this latter point Kingdon said no; the Burr-Brown DAC is excellent for PCM. DSD remastering is optional and you can easily enable or disable the feature.

Somerset also explained that the Z1ES does not stream music; it copies audio files to its own internal storage (1TB hard drive). However it can detect when music is added to a network location such as a NAS (Network Attached Storage) drive and copy it automatically. The reason it is copied and not streamed is to eliminate network latency, he said. If 1TB is not enough, you can attach a USB external drive, but this must be reformatted to Ext4 by the system, deleting any existing files.

The Ext4 limitation was a matter of some discussion and discontent among the audience. The Z1ES runs Linux internally, hence the requirement for Ext4; but Linux can mount other filesystems successfully, so a future firmware update will likely remove this limitation.

Kingdon then answered questions – would the unit go out of date quickly? No, it will have a long life, he promised. Why no video output? “It’s a pure audio product,” he said.

Eventually we got to a demo. Somerset kicked off by playing a Bob Dylan track, Blowin’ in the Wind (recorded in 1963), in three different formats. The first was 24-bit/88.2kHz FLAC (I imagine derived from the DSD used for the SACD release, as conversions from SACD often end up as 24/88). The second was 256kbps MP3. Finally, there was what he described as a “heavily compressed” MP3, though the exact resolution was not specified. All were derived from the same original source, we were told.

“For me, focusing on the vocals, you can really hear the difference in brightness,” said Somerset.

The odd thing was that (to my ears) the 24/88 version did indeed sound brighter and slightly louder than the MP3, which I find puzzling. I’m not aware of any technical reason why high resolution audio should sound any brighter (or tonally different) from CD or MP3; and since even a small level difference tends to be perceived as better quality, an unmatched comparison like this proves little. There was not a dramatic difference in overall quality from what little I could tell in the few seconds of music we heard, but I was not sure that the brighter sound was an improvement; Dylan can sound a little strident at times and the slightly mellower (and dare I say, more analogue-sounding) MP3 version could well be preferred.

We switched back and forth a couple of times, and then Somerset played the “heavily compressed” version. This sounded OK too, from what I could hear of it, which might explain why Somerset talked over it and stopped playing it quickly, saying how bad it was.

Next we heard a DSD download from Blue Coast records; it was Immediately Blessed by Keith Greeninger. This sounded superb, far better than the Dylan, though I doubt this was much to do with formats, but more because it was a modern recording made by a dedicated audiophile label. It was the best sound we heard.

Daft Punk followed, at 24/88.2, and then a 24/96 Linda Ronstadt track from 1983, and then a Nat King Cole song from 1957 in 16/44.1 format.

That was it for demos, if I remember right. What was notable to me was that Sony never demonstrated high-res vs CD quality, played only one DSD track, and used mostly older recordings. Some of these older recordings do indeed sound great, but I doubt it is the best way to demonstrate high resolution audio. If you attended the session as a high-res sceptic you would have heard nothing to change your mind.

Another odd thing was that we heard tracks that were available on SACD but played to us as PCM, most likely converted from the SACD source. Why did we not hear the DSD? It is probably to do with the difficulty all of us have in ripping SACD to audio files, which can only be done (as far as I am aware) with a hacked PlayStation 3 running old firmware.

I asked Kingdon why Sony does not make its high-res products like the Z1ES more attractive by giving us the ability to rip SACDs at best quality. The record companies would not like it, he said. “I’ve had this discussion so many times, I’ve got a big SACD collection, some of it isn’t available any more, I’m sorry, I don’t have an answer for you.”

Despite some frustration at the brevity and content of the demos, this was an enjoyable event with great hospitality from the Audio Lounge, some fascinating recollections from Kingdon of his time with Sony over many years, and a high level of warmth and friendliness all round.

Now if I were Sony I would use the best possible sources to show off high-res audio and the new player, and avoid misleading comparisons or doubtful technical statements. The fact is that many high-res sources, whether SACD, DVD Audio (which you can easily rip to a player like this) or downloads, do sound excellent, and for many that is more than enough to justify purchase.

Would a beautifully mastered CD or CD-quality download sound just as good? Possibly, and the fact that Sony did not attempt to demonstrate the difference, but compared high-res to MP3, lends support to the idea. If there really is a big difference, why not demonstrate it?

As for the Z1ES itself, I heard enough to know that it can sound very good indeed. It is disappointing that it has no surround sound capability, and no digital input so you could use it as an external DAC, but those are not show-stoppers. For myself I would be more inclined to invest in a standalone DAC, maybe one which is both DSD and PCM capable, but if you like simplicity, then a machine with its own storage, DAC, remote, and handy screen for album artwork does make sense.

Fixing a low-tech computer attack by fake “Microsoft”

For the second time this week, I wasted some time fixing an infected Windows PC. The intriguing aspect of this infection though is that it was not really a virus – unless you count crude scripts designed to scare and inconvenience the user.

The problem started when an elderly friend was called, so she thought, by Microsoft. It was not Microsoft at all, but a fraudster from, it appears, India. He explained that there was a problem with her PC and offered to fix it. I am not sure of all the details, but she ended up paying £20 (after negotiating down from a higher figure) to a bank account in Calcutta.

While this does not sound like something any sane person would do, no doubt these people are suitably convincing after years of practice. It is also true that Microsoft has support staff in India; note, though, that the real company NEVER rings out of the blue with a virus warning, so if this happens to you, it is a scam.

I found some payment forms on her PC. They include all the right logos.

image

The criminal got her to install TeamViewer and I found an entertaining batch file which perhaps he ran to simulate a security product. Here is part of it:

echo license key received
start /w wscript.exe C:sleep2000.vbs
echo:
echo:
echo:
echo Windows License is activated for Lifetime.
start /w wscript.exe C:sleep2000.vbs

and concludes:

echo Your license key has been succesfully activated in your computer..
echo Now computer is protected from hackers.

She thought that was the end of it, until she restarted her PC. First, she was prompted to run an executable called AA_v3.exe. If she cancelled, she got a message:

You have been hit by a stuxnet virus, you may lose all your files and folders

and then:

image

and

image

This is a simple .VBS script that displays message boxes in a loop.

Next, the computer shuts down. Why? Because the “stuxnet” message came from a command in her startup folder that looks like this:

%windir%\system32\shutdown.exe -s -t 120 -c "You have been hit by a stuxnet virus, you may lose all your files and folders"

This runs before the other messages so you end up with a scary command prompt, more scary messages, and then your PC shuts down.

I am not sure what happens if you DO run AA_v3.exe. This, it turns out, is free remote control software called Ammyy Admin. This is so often used by scammers that there is a warning about it on the vendor’s web site:

!!! If you receive a phone call claiming to be from ‘Microsoft’ or someone claiming to work on their behalf, telling you that you have a virus on your computer or some errors which they will help you to fix via Ammyy Admin, it is definitely a scam.

Of course victims will not see this warning.

If you run it though, maybe the criminal can connect and cancel the shutdown before two minutes is up, and use the PC in a botnet. Or maybe there is a follow-up call demanding more money to fix the problem. Who knows?

The attraction of these low-tech scripts (for the fraudsters) is that anti-virus software will not detect anything amiss – though in fact, Ammyy Admin is so widely used for criminal purposes that 10 out of 50 anti-virus products used by VirusTotal report it as a “risky” executable.

image

The fix in this case was to log on using a different user profile – Safe mode would also have worked but I was working remotely. Once logged on I was able to remove the startup entries and run some other malware checking tools; ideally you would reinstall Windows but this is inconvenient for a home user.

The problem as ever is that if you know criminals have had use of a machine, you do not know what else they may have done.

This scam still seems to be common and profitable for the fraudsters, and will continue I imagine, unless both source and target countries make a real effort to find and prosecute those responsible.

Google, Bing: time to junk these parasitic download sites

“Users of today’s PCs live on a precipice. One false click and the adware and malware invades,” I remarked in a recent comment on Microsoft’s Surface Pro 3 launch.

The remark was prompted by a recent call from a friend. His PC was playing up. He was getting all sorts of security warnings and being prompted to download more and more apps supposedly to fix problems. It all started, he said, when he went to Google to install iTunes.

After the clean-up, I wondered what had happened. I went to Google and typed in iTunes.

image

The top hit is Apple, which perhaps to prevent this kind of problem has actually paid for an ad on its own brand name. However my friend, understandably, went for the link that said iTunes Free Download (actually I am not sure if this was the exact link he clicked, but it was one like it).

Note how the ads are distinguished from the organic hits only by a small yellow indicator.

Microsoft’s Bing, incidentally, is even worse; I presume because Apple has not paid for an ad:

image

Using a secure virtual machine, I investigated what happens if you click one of these links (I advise you NOT to try this on your normal PC). I clicked the Google one, which took me to SOFTNOW.

image

I hit the big Download button.

image

It is downloading a setup from drive-files-b.com which claims to be iTunes, but it is not, as we will see.

The file passes Microsoft’s security scan and runs. The setup is signed by Perion Network Ltd.

image

Now here comes iTunes – or does it?

image

I clicked to see the Terms of Service. These are from Perion, not Apple, and explain that I am going to get an alternative search service for my browser plus other utilities, on an opt-out basis.

image

However I doubt my friend clicked to see these. Probably he hit Next.

image

Apparently I have “elected to download Search Protect”. There are more terms to agree. The Skip and Skip All buttons are in grey; in fact, the Skip button looks disabled though perhaps it is not.

image

Now here comes a thing called Wajam which is going to recommend stuff to me.

image

And another horror called WebSteroids with more terms of use:

image

I am going to get “display ads (banner ads), text ads, in-text ads, interstitial ads, pop up ads, pop under ads, or other types of ads. Users may see additional ads when using their internet browser or other software”.

Thanks.

Now “iTunes” seems to be downloading.

image

Once it downloads, I get an Install Now button. Apparently all those Next buttons I clicked did not install iTunes after all.

image

This last button, of course, downloads the real setup from Apple and runs it. Unfortunately it is the wrong version.

image

Who is to blame for all this? Well, the warning signs may be obvious to those of us in the trade, but frankly it is not that unreasonable to go to your trusted search engine, type in iTunes, and click the download link.

The blame is with Google (and Bing) for taking money from these advertisers, whose aim is to get you to download their intrusive, ad-laden extras.

Apple iTunes is free software and you can get it from Apple here.

Note that Google is experimenting with removing the address bar altogether, so you can only navigate the web by searching Google (which is what people do anyway). This would make users even more dependent on the search providers to do the right thing, which as you can see from the above, is not something you can count on.

On Microsoft Surface: premium hardware, declining vision

image
Microsoft’s Panos Panay shows off Surface Pro 3

Microsoft’s Surface Pro 3 was launched yesterday, but the roots of Microsoft’s Surface project – the company’s first own-brand PC – go back a long way. There are three big issues which it attempts to tackle:

1. The PC OEM hardware ecosystem was (and to a large extent still is) stuck in a vicious loop of a price-sensitive market driving down prices and forcing vendors to skimp on design and materials, and to pre-install unwanted third-party applications that damage user experience. Most high-end users bought Macs instead. With Surface Microsoft breaks out of the loop with premium design and zero unwanted add-ons.

2. The tablet market. Windows 8 is designed for touch, at least in its “Metro” personality. But desktop apps need a keyboard and mouse. How do you combine the two without creating a twisty monster? Surface with its fold-back, tear-off keyboard cover is an elegant solution.

3. Fixing Windows. Users of today’s PCs live on a precipice. One false click and the adware and malware invades. Live in the “Metro” environment, or use an iPad, and that is unlikely to happen. Use Windows RT (Windows on ARM) and it is even less likely, since most malware cannot install.

Surface could not have happened without Windows 8; without it, the efforts to make Surface work as a tablet would make no sense.

Now we have Surface Pro 3. How is Microsoft doing?

I have followed Surface closely since its launch in September 2012. The models I know best are the original Surface RT, the second Surface RT called Surface 2, and the original Surface Pro, which is my machine of choice when travelling. A few observations.

There is plenty that I like (otherwise I would not use it so much). It really is slim and compact, and I would hate to go back to carrying a laptop everywhere. It is well-made and fairly robust, though the hinge on the keyboard covers is a weak point where the fabric can come unglued. The kickstand is handy, and one of my favourite configurations is Surface on its kickstand plus Bluetooth keyboard and mouse, with which I can be almost as productive as with a desktop (I do miss dual displays). I can also use the Surface successfully on my lap. In cramped aircraft seats it is not great but better than a laptop.

There are also annoyances. Only one USB port is a severe limitation and seems unnecessary, since there is room along the edge. For example, if you plug in an external drive, you cannot then attach your camera. Not being able to upgrade the internal SSD is annoying, though I suppose inherent to the sealed design. Performance was poor on the original Surface RT, though Surface 2 is fine.

More annoying are the bugs. Sometimes the keyboard cover stops working; detaching and re-attaching usually but not always fixes it. Sometimes the wifi plays up and you have to disable and re-enable the wifi adapter in device manager. Another problem is power management, especially on Surface Pro (I gather that Pro 2 is better). You press power and it does not resume; or worse, you put it into your bag after pressing power off (which sends it to sleep), only to find later that it is heating your bag and wasting precious battery.

The key point here is this: Microsoft intended to make an appliance-like PC that, because of the synergy between first-party hardware and software, would be easy to maintain. It did not succeed, and even Surface RT is more troublesome to maintain than an iPad or Android tablet.

Microsoft also ran into user acceptance problems with Windows RT. Personally I like RT, I think I understand what Microsoft is (or was) trying to achieve, and with Surface specifically, I love the long battery life and easier (though still imperfect) maintenance that it offers. However, the apps are lacking, and Microsoft has so far failed to establish Windows as a tablet operating system like iOS and Android. People buy Windows to run Windows apps, they make little use of the Metro side, and for the most part Surface customers are those who would otherwise have bought laptops.

Incidentally, I have seen Surface RT used with success as a fool-proof portable machine for running Office and feel it deserved to do better, but the reality is that Microsoft has not persuaded the general public of its merits.

Another issue with Surface is the price. Given that most Surface customers want the keyboard cover, which is integral to the concept, the cost is more than that of most laptops. But was Microsoft going for the premium market, or trying to compete with mass-market tablets? In reality, Surface is too expensive for the mass market, which is why its best success has been amongst high-end Windows users.

Surface Pro 3 and the launch that wasn’t

That brings me to Surface Pro 3. The intriguing aspect of yesterday’s launch is that it was rumoured to be for a new mini-sized Surface probably running Windows RT. Why else was the invite (which someone posted on Twitter) for a “small gathering”?

image

Admittedly, it is a stretch to suppose that the Surface Mini was cancelled between the date the invitations were sent out (around four weeks ago, I believe) and the date of the event. On the other hand, this is a time of change at Microsoft. The Nokia acquisition completed on 25th April, putting former Nokia CEO Stephen Elop in charge of devices. Microsoft CEO Satya Nadella has only been in place since 4th February. While cancelling a major hardware launch at such short notice would be surprising, it is not quite impossible, and a report from Bloomberg to that effect seems plausible.

It is also well-known that Microsoft does not intend to continue with three mobile operating systems: Windows x86, Windows on ARM, and Windows Phone. Windows Phone and Windows RT will “merge”, though merge may just mean that one will be scrapped, and that it will not be Windows Phone.

The promised arrival of a touch-friendly Microsoft Office for Windows Phone and Windows 8 will further rob Windows RT of a key distinctive feature.

This does not mean that Microsoft will not compete in the growing market for small tablets. It means, rather, that a future small tablet from Microsoft will run the Windows Phone OS – which is what some of us thought Microsoft should have done in the first place. This is a company that sometimes takes the hardest (and most expensive) possible route to its destination – see also Xbox One.

Surface Pro 3 specs: a MacBook Air competitor

Surface Pro 3 is a large-size Surface Pro. It has a 12 inch 2160×1440 screen, a pen, and a redesigned keyboard cover that has an additional magnetic strip which sticks to the tablet when used laptop-style, for greater stability.

The kickstand can now be used at any angle, supposedly without slipping.

image

The weight is 800g, making it lighter than a MacBook Air.

image

though note that the MacBook Air has a keyboard built in.

Battery life is quoted as “up to 9 hours”. There is still only one USB port. Full specs are here.

The Surface Pro 3 looks like a nice device. In the UK it starts at £639 for an Intel i3 device with a tiny 64GB SSD (I am running out of space with 128GB). And don’t forget the cover which will be at least £110 on top (prices include VAT).

A sensible Core i5 with 256GB SSD and a Type 2 cover will be around £1200. Not a bad buy; though personally I am not sure about the larger size.

Note that Microsoft has now abandoned the 16:9 wide-screen format which characterised the original release of Windows 8, designed to work well with two apps side by side. Surface Pro 3 has a conventional 3:2 screen ratio.

Declining vision

Microsoft’s Surface project had a bold vision to reinvent Windows hardware and to usher in a new, more secure era of Windows computing, where tablet apps worked in harmony with the classic desktop.

It was bold but it failed. A combination of flawed implementation, patchy distribution, high prices, and above all, lack of success in the Windows Store ecosystem, meant that Surface remained at ground level.

What we have now is, by all accounts, an attractive high-end Windows hybrid. Not a bad thing in itself, but far short of what was originally hoped.

Microsoft is moving on, building on its investment in Active Directory, Azure cloud, and Microsoft Office, to base its business on an any-device strategy. The market has forced its hand, but it is embracing this new world and (to my mind) looks like making a success of it. It does not depend on the success of Surface, so whether or not the company ends up with a flourishing PC business is now almost incidental.

Microsoft Small Business Server to Server Essentials R2: not a smooth transition

Recently I assisted a small business (of around 10 users) with a transition from Small Business Server 2003 to Server Essentials R2.

Small Business Server 2003 had served it well for nearly 10 years. The package includes Windows Server 2003 (based on XP), Exchange, and the rather good firewall and proxy server ISA Server 2004 (the first release had ISA 2000, but you could upgrade).

image

SBS 2003 actually still does more than enough for this particular business, but it is heading for end of support, and there are some annoyances like Outlook 2013 not working with Exchange 2003. This last problem had already been solved, in this case, by a migration to Office 365 for email. No problem then: simply migrate SBS 2003 to the latest Server 2012 Essentials R2 and everything can continue running sweetly, I thought.

Server Essentials is an edition designed for up to 25 users / 50 devices and is rather a bargain, since it is cheap and no CALs are required. In the R2 version matters are confused by the existence of a Server Essentials role, which lets you install the simplified Essentials dashboard in any edition of Windows Server 2012. The advantage is that you can add as many users as you like; the snag is that you then need CALs in the normal way, so it is substantially more expensive.

Despite the move to Office 365, an on-premise server is still useful in many cases, for example for assigning permissions to network shares. This is also the primary reason for migrating Active Directory, rather than simply dumping the old server and recreating all the users.

The task then was to install Server Essentials 2012 R2, migrate Active Directory to the new server, and remove the old server. An all-Microsoft scenario using products designed for this kind of set-up should be easy, right?

Well, the documentation starts here. The section in TechNet covers both Server 2012 Essentials and the R2 edition, though if you drill down, some of the individual articles apply to one or the other. If you click the post promisingly entitled Migrate from Windows SBS 2003, you notice that it does not list Essentials R2 in the “applies to” list, only the first version, and there is no equivalent for R2.

Hmm, but is it similar? It turns out, not very. The original Server 2012 Essentials has a migration mode and a Migration Preparation Tool which you run on the old server (it seems to run adprep judging by the description, which updates Active Directory in preparation for migration). There is no migration tool nor migration mode in Server 2012 Essentials R2.

So which document does apply? The closest I could find was a general section on Migrate from Previous Versions to Windows Server 2012 R2 Essentials. This says to install Server 2012 Essentials R2 as a replica domain controller. How do you do that?

To install Windows Essentials as a replica Windows Server 2012 R2 domain controller in an existing domain as global catalog, follow instructions in Install a Replica Windows Server 2012 Domain Controller in an Existing Domain (Level 200).

Note the “Level 200” sneaked in there! The article in question is a general technical article for Server 2012 (though in this case equally applicable to R2) aimed at large organisations and full of information that is irrelevant to a tiny 10-user setup, as well as being technically more demanding than you would expect for a small business setup.

Fortunately I know my way around Active Directory to some extent, so I proceeded. Note that you have to install the Active Directory Domain Services role before you can run the relevant PowerShell cmdlets (Install-ADDSDomainController and its companions). Of course it did not work though. I got an error message: “Unable to perform Exchange Schema Conflict Check.”

This message appears to relate to Exchange, but I think this is incidental; it just happens to be the first check that does not work. I think it was a WMI (Windows Management Instrumentation) issue, though I did not realise this at first.

I should mention that although the earlier paper on migrating to Server Essentials 2012 is obsolete, it is the only official documentation that describes some of the things you need to do on the source server before you migrate. These include changing the configuration of the internet connection to bypass ISA Server (single network card configuration), which you do by running the Internet Connection Wizard. You should also check that Active Directory is in good health with dcdiag.exe.

I now did some further work. I removed ISA Server completely, and removed Exchange completely (note you need your SBS 2003 install CD for this). Removing ISA broke the Windows Server 2003 built-in firewall, but I decided not to worry about it. Following a tip I found, I also used ntdsutil to change the DSRM (Directory Services Recovery Mode) password. I also raised the SBS AD forest functional level to Server 2003 (it was at Server 2000), which is necessary for migration to work.

I am not sure which step did the trick, but eventually I persuaded the PowerShell for creating the Replica Domain Controller to work. Then I was able to transfer the FSMO roles. I was relieved; I gather from reading around that some have abandoned the attempt to go from AD in Server 2003 to AD in Server 2012, and used an intermediate Server 2008 step as a workaround – more hassle.

After that things went relatively smoothly, but not without annoyances. There are a couple to mention. One is that after migrating the server, you are meant to connect the client computers by visiting a special URL on the server:

Browse to http://destination-servername/connect and install the Windows Server Connector software as if this was a new computer. The installation process is the same for domain-joined or non-domain-joined client computers.

If you do that from a client computer that was previously joined to the SBS domain (having removed unwanted stuff like the SBS 2003 client and ISA client) then you are prompted to download and run a utility to join the new network. You do that, and it says you cannot proceed because a computer of the same name already exists. But this is that same computer! No matter, the wizard will not run, though the computer is in fact already joined to the domain.

If you want to run the connect wizard and set up the Essentials features like client computer backup and Anywhere Access, then as far as I can tell this is the official way:

  • Make sure you have an admin user and password for the PC itself (not a domain user).
  • Remove the computer from the domain and join it to a workgroup. Make sure the computer is fully removed from the domain.
  • Then go to the connect URL and join it back.

If you are lucky, the domain user profile will magically reappear with all the old desktop icons, My Documents and so on. If you are unlucky you may need manual steps to recover it, or to use profile migration tools.

This is just lazy on Microsoft’s part. It has not bothered to create a tool that will do what is necessary to migrate an existing client computer into the Server Essentials experience (unless such a tool exists and I did not find it; I have seen reports of regedit hacks).

The second annoyance was with the Anywhere Access wizard. This is for enabling users to log in over the internet and access limited server features, and connect to their client desktop. I ran the wizard, installed a valid certificate, used a valid DNS name, manually opened port 443 on the external firewall, but still got verification errors.

image

Clicking Repair is no help. However, Anywhere Access works fine. I captured this screenshot from a remote session:

image

All of the above is normal business for Microsoft partners, but does illustrate why small businesses that take on this kind of task without partner assistance may well run into difficulties.

Looking at the sloppy documentation and missing pieces I do get the impression that Microsoft cares little about the numerous small businesses trundling away on old versions of SBS, but which now need to migrate. Why should it, one might observe, considering how little it charges for Server 2012 Essentials? It is a fair point; but I would argue that looking after the small guys pays off, since some grow into big businesses, and even those that do not form a large business sector in aggregate. Google Apps, one suspects, is easier.

An underlying issue, as ever with SBS, is that Windows Server and in particular Active Directory is designed for large scale setups, and while SBS attempts to disguise the complexity, it is all there underneath and cannot always be ignored.

In mitigation, I have to say that for businesses like the one described above SBS has done a solid job with relatively little attention over many years, which is why it is worth some pain in installation.

Update: A couple of further observations and tips.

Concerning remote access, I suspect the wizard wants to see port 80 open and directed to the server; however this is not necessary as far as I can tell. It is also worth noting that Server Essentials R2 installs TS Gateway, which means you can configure RDP direct to the server desktop (rather than to the limited dashboard you get via the Anywhere Access site).

The documentation, such as it is, suggests that you use the router for DHCP. Personally I prefer to have this on the server, and it also saves time and avoids errors since you can import the DHCP configuration to the new server.

Hands on with Cordova in Visual Studio

At TechEd this week, Microsoft announced Apache Cordova support in Visual Studio 2013. A Cordova app is HTML and JavaScript wrapped as a native app, with support for multiple platforms including iOS and Android. It is the open source part of Adobe’s PhoneGap product. I downloaded the preview from here and took a quick look.

There is a long list of dependencies which the preview offers to install on your behalf:

image

and

image

The list includes the Java SDK, Google Chrome and Apple iTunes. The documentation explains that Java is required for the Android build process, Chrome is required to run the Ripple emulator (so you could choose not to install if you do not require Ripple), and iTunes is required for deploying an app to an iOS device, though a Mac is also required.

The license terms for both Chrome and iTunes are long and onerous, plus iTunes is on my list of applications not to install on Windows if you want it to run fast. Chrome is already installed on my PC, and I unchecked iTunes.

Next, I ran Visual Studio and selected a Multi-Device Hybrid App project (I guess “Cordova app” was rejected as being too short and simple).

image

An annoyance is that if you use the default project location, it is incompatible because of spaces in the path:

image

The project opened, and being impatient I immediately hit Run.

When you build, and debug using the default Ripple emulator (which runs in Chrome, hence the dependency), Visual Studio grabs a ton of dependencies.

image

and eventually the app runs:

image

or you can debug in the Android emulator:

image

A good start.

Microsoft has some sample projects for AngularJS, BackboneJS and WinJS. This last is intriguing, since you could emulate the Windows Phone look and feel (or something like it) on Android or iOS, though it would look far from native.

The preview is not feature-complete. The only supported device targets are Android 4.x, iOS 6 and 7, Windows 8.x Store apps, and Windows Phone 8.x. Windows Phone debugging does not work in this preview.

Review: Velodyne vFree wireless headphones

Last month I took a look at Velodyne’s vLeve headphones. Now it is time to look at the similarly-styled vFree, a wireless model which sits a bit higher in the Velodyne range. The range, incidentally, looks like this, though the prices (taken from the Velodyne site) are list prices; you will likely pay quite a bit less.

  • vLeve on-ear headphones $199
  • vFree on-ear Bluetooth headphones $299
  • vQuiet over-ear noise cancelling Headphones $299
  • vBold over-ear Bluetooth headphones $349
  • vTrue Studio over-ear headphones $399

image

In the smart glossy box you get the headphones, soft bag, cables for USB charging and for a wired audio connection (no microphone when wired), and a leaflet with a guide to pairing.

image

There is no charger; you can use the USB port on a PC or one of the many USB chargers you likely have already.

image

The headphones fold for portability and have various ports, LEDs and buttons. Under the left cup, you will find a battery status LED, the USB charging port, and a socket for wired audio.

The right cup sports most of the functions. There is volume up/down on the edge, and an LED function indicator and microphone on the bottom. The back of the cup is split to form three large buttons, one for Next/Previous, one for power and pairing, and one for play/pause/answer/end call.

Controls on on-ear devices always tend to be awkward, because you cannot see what you are doing. I like the generous size of the vFree controls, but finding which button to press is still tricky at first.

Pairing is a matter of holding down power until the device enters pairing mode. Fairly straightforward, though in my experience some devices pair more easily than others, and it is not clear whether you can pair with multiple devices. I believe you can, because the Nexus was able to reconnect after I paired with the Surface, but I had trouble the other way around and had to re-pair. The function LED is rather dim and hard to see.

The sound

You could say that the vFree is “like the vLeve but wireless”, but although the looks are similar they do not sound the same. Further, bear in mind that a wireless headset contains its own amplifier whereas with a wired set you are dependent on whatever comes with your device.

Perhaps for this reason, I found that the vFree sounded substantially different in the various configurations I tried. The best sound I got was when wired (there is a wired option using the supplied cable) and using a dedicated headphone amplifier: this gave rich bass, clean, clear and spacious sound.

Yet on the two mobile devices I tried, a Surface tablet and a Google Nexus tablet, the vFree sounded better wireless than wired. In fact, the wired option sounded bad in comparison, thinner and slightly distorted.

There is a logic behind this. Mobile devices often have poor audio amplifiers, and when you listen wired, that is what you get. In addition, I found with both these and with the vLeve that they are more than usually sensitive to amplifier quality. With the wireless option, though, you are using a built-in amplifier that is specifically designed for the speakers in the vFree. Against that, wired is a better electrical connection than Bluetooth, so it is a trade-off. Another factor is whether your mobile device supports the higher quality apt-X codec over Bluetooth; many do not, though the vFree does support it.

All of this makes it hard to state definitively how the vFree sounds; it will depend on your set-up. At their best they sound very good, though I doubt wired use with a dedicated amplifier will form typical usage for most. In the other configurations, I found them decent but not outstanding. The bass is particularly clean and tuneful, as you might expect from a supplier of sub-woofers, and the sound in general is refined, never brash or harsh, but lacking the spaciousness that characterises the very best audio.

Comfort is a personal thing; I found the vFree fine for an hour or two but would not want to wear them for longer. They are, though, soft and lightweight.

These are high quality headphones, though not good value at the full price listed on the Velodyne site. Fortunately you can get them elsewhere for considerably less, making them worth consideration.

Specifications

  • Frequency response: 20Hz – 20kHz
  • Impedance: 32 Ω
  • Range: Up to 10m
  • Sensitivity: 98 dB/1 kHz/1mW
  • Codecs: SBC, AAC, apt-X
  • Battery life: 100 hours standby, 10 hours talk and music, 1.5 hours recharge time