Category Archives: Notes from the field

Hands on with Windows Virtual Desktop

Microsoft’s Windows Virtual Desktop (WVD) is now in preview. This is virtual Windows desktops on Azure, and it is the first time Microsoft has come forward with a fully integrated first-party offering. There are a few notable features:

– You can use a multi-session edition of Windows 10 Enterprise. Normally Windows 10 does not support concurrent sessions: if another user logs on, any existing session is terminated. This is an artificial restriction which has more to do with licensing than technology; there are hacks to get around it, but they are pointless if you want to be correctly licensed.

– You can use Windows 7 with free extended security updates to 2023. As standard, Windows 7 support ends in January 2020. Without Windows Virtual Desktop, extended security updates are a paid-for option.

– Running a VDI (Virtual Desktop Infrastructure) can be expensive, but pricing for Windows Virtual Desktop is reasonable. You have to pay for the Azure resources, but licensing comes at no extra cost for Microsoft 365 users. Microsoft 365 is a bundle of Office 365, Microsoft Intune and Windows 10 licenses, and starts at £15.10 or $20 per month; Office 365 Business Premium is £9.40 or $12.50 per month. These are small business plans limited to 300 users.

Windows Virtual Desktop supports both desktops and individual Windows applications. If you are familiar with Windows Server Remote Desktop Services, you will find many of the same features here, but packaged as an Azure service. You can publish both desktops and applications, and use either a client application or a web browser to access them.

What is the point of a virtual desktop when you can just use a laptop? It is great for manageability, security, and remote working with full access to internal resources without a VPN. There could even be a cost saving, since a cheap device like a Chromebook becomes a Windows desktop anywhere you have a decent internet connection.

Puzzling out the system requirements

I was determined to try out Windows Virtual Desktop before writing about it, so I went over to the product page and hit Getting Started. I used a free trial of Azure. There is a complication though: Windows Virtual Desktop VMs must be domain-joined, which means that simply having Azure Active Directory is not enough. You have a few options:

Azure Active Directory Domain Services (Azure ADDS). This is a paid-for Azure service that provides domain join and other services to VMs on an Azure virtual network. It costs from about £80.00 or $110.00 per month. If you use Azure ADDS you set up a separate domain from your on-premises domain, if you have one. However, you can combine it with Azure AD Connect to enable sign-on with the same credentials.

There is a certain amount of confusion over whether you can use WVD with just Azure ADDS and not AD Connect. The docs say you cannot, stating that “A Windows Server Active Directory in sync with Azure Active Directory” is required. However a user reports success without this; of course there may be snags yet to be revealed.

Azure Active Directory with AD Connect and a site-to-site VPN. In this scenario you create an Azure virtual network that is linked to your on-premises network via a site-to-site VPN. I went this route for my trial. I already had AD Connect running, but not the VPN. A VPN requires a VPN Gateway, which is a paid-for option. There is a Basic version which is considered legacy, so I used the VpnGw1 SKU, which costs around £100 or $140 per month.

Update: I have replaced the VPN Gateway with one using the Basic SKU (around £20.00 or $26.00 per month) and it still works fine. Microsoft does not recommend this for production, but for a very small deployment like mine, or for testing, it is much more cost effective.

This solution is working well for me, but note that in a production environment you would want to add some further infrastructure. The WVD VMs are domain-joined to the on-premises AD, which means constant network traffic across the VPN. AD integrates with DNS, so you should also configure the virtual network to use on-premises DNS (see the sketch below). A better solution would be to add an Azure-hosted VM on the virtual network running a domain controller and DNS; of course this is a further cost. Running just Azure ADDS and AD Connect is cheaper and simpler, if it is supported.
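For what it is worth, pointing the virtual network at on-premises DNS can be done with the AzureRM PowerShell module (current at the time of writing) along roughly these lines. The network and resource group names are hypothetical, and if the virtual network has never had custom DNS servers set you may need to populate DhcpOptions first:

# Point the WVD virtual network at an on-premises DNS server reachable over the VPN
$vnet = Get-AzureRmVirtualNetwork -Name "wvd-vnet" -ResourceGroupName "wvd-rg"
$vnet.DhcpOptions.DnsServers.Add("192.168.0.10")   # on-premises DC / DNS server
Set-AzureRmVirtualNetwork -VirtualNetwork $vnet

VMs on the network pick up the new DNS setting when they next renew their lease or restart.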

Incidentally, I use pfSense for my on-premises firewall, so this is the endpoint for my site-to-site VPN. Initially it did not work. I am not sure what fixed it, but it may have been the TCP MSS Clamping referred to here; I set this to 1350 as suggested. I was happy to see the connection come up in pfSense.

image 

Setup options

There are a few different ways to set up WVD. You start by setting some permissions and creating a WVD Tenant as described here. This requires PowerShell but it was pretty easy.
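For reference, the tenant creation step boils down to a few cmdlets from the preview PowerShell module. The sketch below is based on the documented steps as I recall them; the tenant name and GUIDs are placeholders for your own values, and the cmdlets may well change as the preview evolves.

# Install and load the Windows Virtual Desktop preview module
Install-Module -Name Microsoft.RDInfra.RDPowerShell
Import-Module -Name Microsoft.RDInfra.RDPowerShell
# Sign in to the WVD management service
Add-RdsAccount -DeploymentUrl "https://rdbroker.wvd.microsoft.com"
# Create the WVD tenant, linked to your Azure AD tenant and Azure subscription
New-RdsTenant -Name "MyWvdTenant" -AadTenantId "00000000-0000-0000-0000-000000000000" -AzureSubscriptionId "00000000-0000-0000-0000-000000000000"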

image

The next step is to create a WVD host pool and this was less straightforward. The tutorial offers the option of using the Azure Portal and finding Windows Virtual Desktop – Provision a host pool in the Azure Marketplace. Or you can use an Azure Resource Manager template, or PowerShell.

I used the Azure Marketplace, thinking this would be easier. When I ran into issues, I tried using PowerShell, but had difficulty finding the special Windows 10 Enterprise Virtual Desktop edition via this route. So I went back to the portal and the Azure marketplace.

Provisioning the host pool

Once your tenant is created, and you have the system requirements in place, it is just a matter of running through a wizard to provision the host pool. You start by naming it and selecting a desktop type: Pooled for multi-session Windows 10, or Personal for a VM per user. I went for the Pooled option.

image

Next comes VM configuration. I stumbled a bit here. Even if you specify just 10 (or 1) users, the wizard recommends a fairly powerful VM, a D8s v3. I thought this would be OK for the trial, but the trial subscription would not let me continue as it is too expensive, so I ended up with a D4s v3. I also tried using a D4 v3, but that failed to deploy because it does not support premium storage; so the “s” is important.

image

The next dialog has some potential snags.

image

This is where you choose an OS image; note that the default is Windows 10 Enterprise multi-session for a pooled WVD. You also specify a user account which becomes the default for all the VMs and is used to join the VMs to the domain. These credentials are also used to create a local admin account on each VM, in case the domain join fails and you need to connect (I did need this).

Note also that the OU path must be specified as a distinguished name, in the form OU=wvd,DC=yourdomain,DC=com (for example), not just the name of an OU; otherwise you will get errors on domain join.

Finally, take care with the virtual network selection. It is quite simple: if you are doing what I did and domain-joining to an on-premises domain, the virtual network and subnet need to have connectivity to your on-premises DCs and DNS.

The next dialog is pretty easy. Just make sure that you type in the tenant name that you created earlier.

image

Next you get a summary screen which validates your selections.

image

I suggest you do not take this validation too seriously. I found it happily validated a non-working configuration.

Hit OK and you can deploy your WVD host pool. This takes a while, around 10-15 minutes in my case when it works. If it does not work, it can fail quickly or slowly depending on where in the process it fails.

My problem, after fixing issues like using the wrong type of OS image, was failure to join the VM to the domain. I could not see why this did not work. The displayed error may or may not be useful.

image

If the deployment got as far as creating the VM (or VMs), I found it helpful to connect to a VM and look at its event viewer. I could connect from my on-premises network thanks to the site-to-site VPN.

I discovered several issues before I got it working. One was simple: I mistyped the name of the vmjoiner user when I created it so naturally it could not authenticate. I was glad when it finally worked.

image

Connection

Once I got the host pool up and running, my trial WVD deployment was fine. I can connect via a special Remote Desktop client or a browser. The WVD session is fast and responsive, and the VPN to my office is rather handy.

image

Observations

I think WVD is a good strategic move from Microsoft and will probably be popular. I have to note though that setup is not as straightforward as I had hoped. It would benefit Microsoft to make the trial easier to get up and running and to improve the validation of the host pool deployment.

It also seems to me that for small businesses an option to deploy with only Azure ADDS and no dependency on an on-premises AD is essential.

As ever, careful network planning is a requirement and improved guidance for this would also be appreciated.

Update:

There seems to be a problem with Office licensing. I have an E3 license. Office installs but comes up with a licensing error. I presume this is a bug in the preview.

image

This was my mistake as it turned out. You have to take some extra steps to install Office Pro Plus on a terminal server, as explained here. In my case, I just added the registry value SharedComputerLicensing with a setting of 1 under HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office\ClickToRun\Configuration. Now it runs fine. Thanks to https://twitter.com/getwired for the tip.
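In PowerShell, run as administrator, something like this does it; I used a string value of 1, as above, but check the linked article for the current guidance:

# Enable shared computer activation for Office Pro Plus Click-to-Run
New-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\Office\ClickToRun\Configuration" -Name "SharedComputerLicensing" -PropertyType String -Value "1" -Force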

Microsoft’s Dynamics CRM 2016/365: part brilliant, part perplexing, part downright sloppy

I have just completed a test installation of Microsoft’s Dynamics CRM on-premises; it is now called Dynamics 365, but the name change is cosmetic, and in fact you begin by installing Dynamics CRM 2016, which becomes Dynamics 365 after applying a downloaded update.

Microsoft’s Dynamics product has several characteristics:

1. It is fantastically useful if you need the features it offers

2. It is fantastically expensive for reasons I have never understood (other than, “because they can”)

3. It is tiresome to install and maintain

I wondered if the third characteristic had improved since I last did a Dynamics CRM installation, but I feel it has not changed much. The installation itself went pretty much as planned, though it remains fiddly, but I wasted considerable time setting up email synchronization with Exchange (also on-premises). This is a newish feature called Server-Side Synchronization, which replaces the old Email Router (which still exists but is deprecated). I have little love for the Email Router which, when anything goes wrong, fills the event log with huge numbers of identical errors, such that you have to disable it before you can discover what is really going wrong.

Email is an important feature as automated emails are essential to most CRM systems. The way the Server-Side Synchronization works is that you configure it, but CRM mailboxes are disabled until you complete a “Test and Enable” step that sends and receives test emails. I kept getting failures. I tried every permutation I could think of:

  • Credentials set per-user
  • Credentials set in the server profile (uses Exchange Impersonation to operate on behalf of each user)
  • Windows authentication (only works with Impersonation)
  • Basic authentication enabled on Exchange Web Services (EWS)

All failed, the most common error being “Http server returned 401 Unauthorized exception.” The troubleshooting steps here say to check that the email address of the user matches that of the mailbox; of course it did.

An annoyance is that on my system the Test and Enable step does not always work (in other words, it is not even tried). If I click Test and Enable in the Mailbox configuration window, I get this dialog:

image

However if I click OK, nothing happens and the dialog stays. If I click Cancel nothing happens and the dialog stays. If I click X the dialog closes but the test is not carried out.

Fortunately, you can also access Test and Enable from the Mailbox list (select a mailbox and it appears in the ribbon). A slightly different dialog appears and it works.

I was about to give up. I set Windows authentication in the server profile, which is probably the best option for most on-premises setups, and tried the test one more time. It worked. I do not know what changed. As this tech note (which is about server-side synchronization using Exchange Online) remarks:

If you get it right, you will hear Microsoft Angels singing

But what’s this about sloppy? There is plenty of evidence. Things like the non-functioning dialog mentioned above. Things like the date which shows for a mailbox that has not been tested:

image

Or, leaving aside the email configuration, things like the way you can upload Word templates for use in processes but cannot easily download them (though you can with a third-party tool such as XrmToolBox).

And the script error dialog which has not changed for a decade.

Or the warning you get when viewing a report in Microsoft Edge, that the browser is not supported:

image

so you click the link and it says Edge is supported.

Or even the fact that whenever you log on you get this pesky dialog:

image

So you click Don’t show this again, but it always reappears.

It seems as if Microsoft does not care much about the fit and finish of Dynamics CRM.

So why do people persevere? In fact, the Dynamics business is growing for Microsoft, largely because of Dynamics 365 online and its integration with Office 365. The cloud is one reason, since it removes at least some of the admin burden. The other thing, though, is that it brings together a set of features that make it invaluable to many businesses. You can use it not only for sales and marketing, but also for service case management, quotes, orders and invoices.

It is highly customizable, which is a mixed blessing as your CRM installation becomes increasingly non-standard, but does mean that most things can be done with sufficient effort.

In the end, it is all about automation, and with carefully designed custom processes it can work like magic.

With all those things to commend it, it would pay Microsoft to work at making the user interface less annoying and the administration less prone to perplexing errors.

Configuring the Android emulator for Hyper-V

Great news that the Android emulator now supports Hyper-V, but how do you enable it?

Pretty simple. First, you have to be running at least Windows 10 1803 (April 2018 Update). Then, go into Control Panel – Programs – Turn Windows features on or off and enable both Hyper-V and the Windows Hypervisor Platform:

image

Note: this is not the same as just enabling Hyper-V. The Windows Hypervisor Platform, or WHPX, is an API for Hyper-V. Read about it here.
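If you prefer the command line, the equivalent from an elevated PowerShell prompt should be something like the commands below; the feature names are what I believe they are called on Windows 10 1803, so check with Get-WindowsOptionalFeature -Online if in doubt.

# Enable Hyper-V and the Windows Hypervisor Platform
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V-All -All
Enable-WindowsOptionalFeature -Online -FeatureName HypervisorPlatform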

Reboot if necessary and run the emulator.

image

Troubleshooting? Try running the emulator from the command line.

emulator -list-avds

will list your AVDs.

emulator @avdname -qemu -enable-whpx

will run the AVD called avdname using WHPX (Windows Hypervisor Platform). If it fails, you may get a helpful error message.

Note: If you get a Qt library not found error, use the full path to the emulator executable. This should be the one in the emulator folder, not the one in the tools folder. The full command is:

[path-to-android-sdk]\emulator\emulator @[avdname] -qemu -enable-whpx

You can also use the emulator from Visual Studio, though you need Visual Studio 2017 version 15.8 Preview 1 or higher with the Xamarin tools installed. That said, I had some success with starting the Hyper-V emulator separately (use the command above), then using it with a Xamarin project in Visual Studio 15.7.5.

image

Notes from the Field: dmesg error blocks MySQL install on Windows Subsystem for Linux

I enjoy Windows Subsystem for Linux (WSL) on Windows 10 and use it constantly. It does not patch itself, so from time to time I update it using apt-get. The latest update upgraded MySQL to version 5.7.22, but unfortunately the upgrade failed: dpkg could not configure the package. I saw messages like:

invoke-rc.d: could not determine current runlevel

2002: Can’t connect to local MySQL server through socket ‘/var/run/mysqld/mysqld.sock’

After multiple efforts uninstalling and reinstalling I narrowed the problem down to a dmesg error:

dmesg: read kernel buffer failed: Function not implemented

It is true, dmesg does not work on WSL. However there is a workaround here that says if you write something to /dev/kmsg then at least calling dmesg does not return an error. So I did:

sudo echo foo > /dev/kmsg

I removed and reinstalled MySQL one more time and it worked:

image

Apparently partial dmesg support in WSL is on the way, previewed in Build 17655.

Note: be cautious about fully uninstalling MySQL if you have data you want to preserve. Export/backup the databases first.

Notes from the field: Windows Time Service interrupts email delivery

A business with Exchange Server noticed that email was not flowing. The internet connection was fine, and all the servers were up and running, including Exchange 2016. Email had been fine just a few hours earlier. What was wrong?

The answer, or the beginning of the answer, was in the Event Viewer on the Exchange Server. Event ID 1035, only a warning:

Inbound authentication failed with error UnexpectedExchangeAuthBlobCheckForClockSkew for Receive connector Default Mailbox Delivery

Hmm. A clock problem, right? It turned out that the PDC for the domain was five minutes fast. This is enough to trigger Kerberos authentication failures. Result: no email. We fixed the time, restarted Exchange, and everything worked.

Why was the PDC running fast? The PDC was configured to get time from an external source, apparently, and all other servers to get their time from the PDC. Foolproof?

Not so. If you typed:

w32tm /query /status

at a command prompt on the PDC (not the Exchange Server, note), it reported:

Source: Free-running System Clock

Oops. Despite efforts to do the right thing in the registry, setting the Type value under HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\W32Time\Parameters to NTP and entering a suitable list of time servers in the NtpServer value, it was actually getting its time from the server clock. This being a Hyper-V VM, that meant the clock on the host server, which – no surprise – was five minutes fast.
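A quick way to see what the time service is actually doing, and what is in those registry values, is to run the following from an elevated prompt on the PDC emulator; the first command shows the full configuration including the type and the NtpServer list, and the second the raw registry values:

w32tm /query /configuration
reg query HKLM\SYSTEM\CurrentControlSet\Services\W32Time\Parameters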

You can check for this error by typing:

w32tm /resync

at the command prompt. If it says:

The computer did not resync because no time data was available.

then something is wrong with the configuration. If it succeeds, check the status as above and verify that it is querying an internet time server. If it is not querying a time server, run a command like this:

w32tm /config /update /manualpeerlist:"0.pool.ntp.org,0x8 1.pool.ntp.org,0x8 2.pool.ntp.org,0x8 3.pool.ntp.org,0x8" /syncfromflags:MANUAL

until you have it right.

Note this is ONLY for the server with the PDC Emulator FSMO role. Other servers should be configured to get time from the PDC.
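On a member server, the equivalent to point it back at the domain hierarchy for time is something like this, run from an elevated PowerShell prompt (details may vary in your environment):

w32tm /config /syncfromflags:DOMHIER /update
Restart-Service w32time
w32tm /resync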

Time server problems seem to be common on Windows networks, despite the existence of lots of documentation. There are also various opinions on the best way to configure Hyper-V, which has its own time synchronization service. There is a piece by Eric Siron here on the subject, and I reckon his approach is a safe one (Hyper-V Synchronization Service OFF for the PDC Emulator, ON for every other VM).

I love his closing remarks:

The Windows Time service has a track record of occasionally displaying erratic behavior. It is possible that some of my findings are not entirely accurate. It is also possible that my findings are 100% accurate but that not everyone will be able to duplicate them with 100% precision. If working with any time sensitive servers or applications, always take the time to verify that everything is working as expected.

Kaspersky encrypted connection scanning breaks ADFS login, internet-facing Dynamics CRM

I was asked to look at a case where a user could not log in to Dynamics CRM. This is an internet-facing deployment which uses ADFS (Active Directory Federation Services). The user put in valid credentials but received a 401 – unauthorized: Access is denied due to invalid credentials.

The odd thing from the user’s perspective was that everything worked fine on other PCs, but switching web browsers on the affected PC did not fix it.

I noticed that Kaspersky anti-virus was installed.

image

Pausing Kaspersky made no difference to the error. However, I came back to this after eliminating some other possible problems. I noticed that the certificate presented by the ADFS site was not the site’s own certificate, but a Kaspersky certificate.

image

The reason for this is that Kaspersky wants to inspect encrypted traffic for malware.

I understand the rationale but I dislike this behaviour. Your security software should not hide the SSL certificate of the web site you are visiting. Of course it is particularly dislikeable if it breaks stuff, as in this case. I found the setting in Kaspersky and disabled both this feature and another which injects script into web traffic for the sake of Kaspersky’s “URL Advisor” (though the script injection proved not to be the culprit here).

Personally I feel that encrypted traffic should only be decrypted in the recipient application. Kaspersky’s feature is an SSL Man-in-the-Middle attack and to my mind reduces rather than increases the security of the PC. However you made the decision to trust your anti-virus vendor when you installed the software.

There are other anti-virus solutions that also do this, so Kaspersky is not alone. As to why it breaks ADFS I am not sure, but I regard that as a good thing, since the user’s SSL connection is compromised.

image

As it turns out, it isn’t essential to disable the feature entirely. You can simply set an exclusion for the ADFS site by clicking Manage exclusions.

Posted here in case others hit this issue.

Unhealthy Identity synchronization Notification: a trivial solution (and Microsoft’s useless troubleshooter)

If you use Microsoft’s AD Connect, also known as DirSync, you may have received an email like this:

image

It’s bad news: your Active Directory is not syncing with Office 365. “Azure Active Directory did not register a synchronization attempt from the Identity synchronization tool in the last 24 hours.”

I got this after upgrading AD Connect to the latest version, currently 1.1.553.

The email recommends you run a troubleshooting tool on the AD Connect server. I did that. Nothing wrong. I rebooted, it synced once, then I got another warning.

This is only a test system but I still wanted to find out what was wrong. I tweaked the sync configuration, again without fixing the issue.

Finally I found this post. Somehow, AD Connect had configured itself not to sync. You can get the current setting in PowerShell, using get-adsyncscheduler:

image

As you can see, SyncCycleEnabled is set to false. The fix is trivial, just type:

set-adsyncscheduler -SyncCycleEnabled $true
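You can then confirm the change and kick off a sync immediately, rather than waiting for the next scheduled cycle, using the ADSync cmdlets on the AD Connect server:

get-adsyncscheduler
start-adsyncsynccycle -PolicyType Delta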

Well, I am glad to fix it, but should not Microsoft’s troubleshooting tool find this simple configuration problem?

How to remove the WINS server feature from Windows Server

The WINS service is not needed in most Windows networks but may be running either for legacy reasons, or because someone enabled it in the hope that it might fix a network issue.

It is now apparently a security risk. See here and Reg article here.

Apparently Microsoft says “won’t fix” despite the service still being shipped in Server 2016, the latest version:

In December 2016, FortiGuard Labs discovered and reported a WINS Server remote memory corruption vulnerability in Microsoft Windows Server. In June of 2017, Microsoft replied to FortiGuard Labs, saying, "a fix would require a complete overhaul of the code to be considered comprehensive. The functionality provided by WINS was replaced by DNS and Microsoft has advised customers to migrate away from it." That is, Microsoft will not be patching this vulnerability due to the amount of work that would be required. Instead, Microsoft is recommending that users replace WINS with DNS.

It should be removed then. I noticed it was running on a server in my network, running Server 2012 R2, and that although it was listed as a feature in Server Manager, the option to remove it was greyed out.

I removed it as follows:

1. Stop the WINS service and set it to manual or disabled.

2. Remove the WINS option in DHCP Scope Options if it is present.

3. Run PowerShell as an administrator and execute the following command:

uninstall-windowsfeature wins

This worked first time, though a restart is required.
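For what it is worth, the whole removal can be scripted from an elevated PowerShell prompt along these lines. The DHCP line assumes the DHCP role and its PowerShell module are on the same server and that WINS/NBNS Servers (option 044) is set at server level; adjust or skip it otherwise.

# Stop and disable the WINS service, clear the DHCP option, then remove the feature
Stop-Service -Name WINS
Set-Service -Name WINS -StartupType Disabled
Remove-DhcpServerv4OptionValue -OptionId 44 -ErrorAction SilentlyContinue
Uninstall-WindowsFeature -Name WINS -Restart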

Incidentally, if Microsoft ships a feature in a Server release, I think it should be kept patched. No doubt the company will change its mind if it proves to be an issue.

Note: you can also use remove-windowsfeature which is an alias for uninstall-windowsfeature. You do need Windows Server 2008 R2 or higher for this to work.

Fixing “couldn’t parse private ssl key” in Dovecot

I run Debian Linux including a mail server, and part of the system is Dovecot, an open source IMAP and POP3 server which has always worked well for me.

Unfortunately it stopped working after an upgrade. With Linux I am in the habit of doing:

apt-get update

apt-get upgrade

to keep the system patched, and normally everything works fine. Occasionally it does not, and then I need to dig in and work out what is wrong and how to fix it. The upgrade to Apache 2.4, for example, was somewhat painful because of changed configuration directives.

This time it was Dovecot that broke. I use Thunderbird to pick up POP3 mail, and nothing was flowing. Eventually I found the problem logged in syslog:

Fatal: Couldn’t parse private ssl_key: error:0906D06C:PEM routines:PEM_read_bio:no start line: Expecting: ANY PRIVATE KEY

I puzzled over this for some time. The path to the private key was correct in dovecot.conf. The permissions were OK. I regenerated the certificate (it’s self-signed) but still the same.

Eventually I found the solution here. The path to the SSL certs used to look like this:

ssl_cert = /etc/ssl/certs/dovecot.pem
ssl_key = /etc/ssl/private/dovecot.pem

Now it must look like this:

ssl_cert = </etc/ssl/certs/dovecot.pem
ssl_key = </etc/ssl/private/dovecot.pem

Yes, you need that angle bracket: it tells Dovecot to read the value from the file, rather than treating the setting as a literal string. Without it, you get the error above.

It used to work, so at some point the Dovecot coders took out the compatibility code that allowed the old-style directive.

Mentioned here in case it helps someone find the solution.

Disabling automatic update restarts in Windows Server 2016

Windows Server 2016 is in effect the Windows 10 version of the server OS. If you look in Settings it seems to have the same attitude to updates; in other words, you get them automatically whether you like it or not. Currently my server is even offering me Windows 10 Creators Update:

image

However, I prefer to have servers just download updates and let me decide when to install them. There can be good reasons for this. For example, I run Exchange Server on a machine that is not really up to spec, and the Exchange services have to be manually started every time it reboots. Well, there are ways round this, but it makes the point.

It turns out that you can after all set Windows Server 2016 to download-only. Just run sconfig from the command line and choose option 5:

image

The sconfig menu will be familiar if you have worked with Server Core or other variants of Windows Server without a GUI.

Incidentally, I tried to install Exchange 2016 on Server 2016 without a GUI but it appears not to be supported. A shame.

Returning to the subject of updates, Brendan Power at Microsoft popped up on Reddit to say that this is a bug in the settings display:

The "Available updates will be downloaded…" text in the UI is a bug that doesn’t represent the actual automatic update settings.

To verify the actual server settings, you can open the command prompt and run sconfig.cmd; in the menu, you should see option 5 set to Manual.

A bug? I am not sure. If so, it seems an odd and obvious one. I think Microsoft is keen to have us update automatically. That said, Windows Server 2016 is meant to follow the Long Term Servicing Branch (LTSB) model rather than the “Windows as a service” approach in Windows 10, unless you run Nano Server, according to this post. So compulsory updating to retain a supported configuration does not apply here.

Of course you should patch your Windows Server installations in a timely manner, however you choose to do it.