All posts by onlyconnect

Google’s Digital Garage, hosted by UK City Councils

I have recently moved into a new area and noticed that my (now) local city council was running a Google Digital Garage:

Winchester City Council is very excited to be partnering up with The Digital Garage from Google – a digital skills training platform to assist you in growing your business, career and confidence, online. Furthermore, a Google digital expert is coming to teach you what is needed to gain a competitive advantage in the ever changing digital landscape, so come prepared to learn and ask questions, too.

I went along as a networking opportunity and to learn more about Google’s strategy. The speaker was from Google partner Uplift Digital, “founded by Gori Yahaya, a digital and experiential marketer who had spent years working on behalf of Google, training and empowering thousands of SMEs, entrepreneurs, and young people up and down the country to use digital to grow their businesses and further their careers.”

I am not sure “digital garage” was the right name in this instance, as it was essentially a couple of presentations with not much interaction and no hands-on. The first session had three themes:

  • Understanding search
  • Manage your presence on Google
  • Get started with paid advertising

What we got was pretty much the official Google line on search: make sure your site performs well on mobile as well as desktop, use keywords sensibly, and leave the rest to Google’s algorithms. The second topic was mainly about Google’s local business directory called My Business. Part three introduced paid advertising, mainly covering Google AdWords. No mention of click fraud. Be wary of Facebook advertising, we were told, since it is rumoured that advertising on Facebook may actually decrease your organic reach. Don’t bother advertising on Twitter, said the speaker.


Session two was about other ways to maintain a digital presence, mainly looking at social media, along with a (rather unsatisfactory) introduction to Google Analytics. The idea is to become an online authority in what you do, we were told. Good advice. YouTube is the second most popular search engine, we were told, and we should consider posting videos there. The speaker recommended the iOS app YouTube Director for Business, a free tool which I later discovered is discontinued from 1st December 2017; it is being replaced by Director Onsite which requires you to spend $150 on YouTube advertising in order to post a video.

Overall I thought the speaker did a good job on behalf of Google and there was plenty of common sense in what was presented. It was a Google-centric view of the world, which, considering that the event is, as far as I can tell, entirely funded by Google, is not surprising.

As you would also expect, the presentation was weak concerning Facebook, Twitter and other social media platforms. Facebook in particular seems to be critically important for many small businesses. One lady in the audience said she did not bother with a web site at all since her Facebook presence was already providing as many orders for her cake-making business as she could cope with.

We got a sanitised view of the online world which in reality is a pretty mucky place in many respects.

IT vendors have always been smart about presenting their marketing as training and it is an effective strategy.

The aspect that I find troubling is that this comes hosted and promoted by a publicly funded city council. Of course an independent presentation, or a session involving multiple companies with different perspectives, would be much preferable; but I imagine the offer of free training, and ticking the box for “doing something about digital”, is too sweet for hard-pressed councils to resist, even if it means turning a blind eye to Google’s ability to make big profits in the UK while paying little tax.

Google may have learned from Microsoft and its partners who once had great success in providing basic computer training which in reality was all about how to use Microsoft Office, cementing its near-monopoly.

Which Azure Stack is right for you?

I went in search of Azure Stack at Microsoft’s Ignite event. I found a few on display in the Expo. It is now shipping, and the Lenovo guy said they had sold a dozen or so already.

Why Azure Stack? Microsoft’s point is that it lets you run exactly the same application on premises or in its public cloud. The other thing is that although you have some maintenance burden – power, cooling, replacing bits if they break – it is pretty minimal; the configuration is done for you.

I talked to one of the vendors about the impact on VMware, which dominates the market for virtualisation in the datacentre. My sense in the VMware vs Hyper-V debate is that VMware still has an edge, particularly in its management tools but Hyper-V is solid (aside from a few issues with Cluster Shared Volumes) and a lot less expensive. Azure Stack is Hyper-V of course; and the point the vendor made was that configuring an equivalent private cloud with VMware would be possible but hugely more expensive, not only in license cost but also in the skill needed to set it all up correctly.

So I think this is a smart move from Microsoft.

Why no Dell? They told me it was damaged in transit. Shame.

image
Lenovo

image
Cisco

image
HP Enterprise

Generating code for simple SQL Server data access without Entity Framework, works with .NET Core

I realise that Microsoft’s Entity Framework is the most common approach for data access in the .NET world, but I have also always had good results from a simple manual approach using DbConnection, DbCommand and DataReader objects, and like the fact that I can see and control exactly what SQL gets executed. If you prefer using Entity Framework or another abstraction that is fine and please stop reading now!

One snag with this more manual approach is that you have to write tedious code building SQL statements. I figured that someone must have written a utility application to generate this code, but could not find one quickly, so I wrote my own. It supports both C# and Visual Basic. The utility connects to a database and lets you generate a class for each table along with code for retrieving and saving these objects, ready for modification. Here you can see a generated class:

image

and here is an example of the generated data access code:

image
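The screenshots do not reproduce here, but to give a flavour of the first one: the generated class is a plain object with one property per column. Based on the GetAuthor code further down (the authors table from the old pubs sample database), it is along these lines – a sketch rather than the utility’s exact output:

public class ClsAuthor
{
    // One property per column in the authors table
    public string Auid { get; set; }
    public string Aulname { get; set; }
    public string Aufname { get; set; }
    public string Phone { get; set; }
    public string Address { get; set; }
    public string City { get; set; }
    public string State { get; set; }
    public string Zip { get; set; }
    public bool Contract { get; set; }
}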

This is NOT complete code (otherwise I would be perilously close to writing my own ORM) but simply automates creating SQL parameters and SQL statements.
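To illustrate what I mean, here is a hand-written sketch in the same style (not the utility’s actual output): a save method that builds a parameterised UPDATE, passing values as SqlParameters rather than concatenating them into the SQL string.

public static void SaveAuthor(ClsAuthor TheAuthor)
{
    SqlConnection conn = new SqlConnection(ConnectString);
    SqlCommand cmd = new SqlCommand();
    try
    {
        // Parameterised SQL: one parameter per column being updated
        cmd.CommandText = "Update Authors set au_lname = @aulname, au_fname = @aufname, phone = @phone where au_id = @auid";
        cmd.Parameters.AddWithValue("@aulname", TheAuthor.Aulname);
        cmd.Parameters.AddWithValue("@aufname", TheAuthor.Aufname);
        cmd.Parameters.AddWithValue("@phone", TheAuthor.Phone);
        cmd.Parameters.AddWithValue("@auid", TheAuthor.Auid);
        cmd.Connection = conn;
        cmd.Connection.Open();
        cmd.ExecuteNonQuery();
    }
    finally
    {
        conn.Close();
        conn.Dispose();
    }
}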

One of my thoughts was that this code should work well with .NET Core. The SqlClient provider implements the required classes. Here is my code for retrieving an author object, mostly generated by my utility:

public static ClsAuthor GetAuthor(string authorID)
{
    SqlConnection conn = new SqlConnection(ConnectString);
    SqlCommand cmd = new SqlCommand();
    SqlDataReader dr;
    ClsAuthor TheAuthor = new ClsAuthor();
    try
    {
        cmd.CommandText = "Select * from Authors where au_id = @auid";
        cmd.Parameters.Add("@auid", SqlDbType.Char);
        cmd.Parameters[0].Value = authorID;
        cmd.Connection = conn;
        cmd.Connection.Open();

        dr = cmd.ExecuteReader();

        if (dr.Read())
        {
            //Get Function
            TheAuthor.Auid = GetSafeDbString(dr, "au_id");
            TheAuthor.Aulname = GetSafeDbString(dr, "au_lname");
            TheAuthor.Aufname = GetSafeDbString(dr, "au_fname");
            TheAuthor.Phone = GetSafeDbString(dr, "phone");
            TheAuthor.Address = GetSafeDbString(dr, "address");
            TheAuthor.City = GetSafeDbString(dr, "city");
            TheAuthor.State = GetSafeDbString(dr, "state");
            TheAuthor.Zip = GetSafeDbString(dr, "zip");
            TheAuthor.Contract = GetSafeDbBool(dr, "contract");
        }
    }
    finally
    {
        conn.Close();
        conn.Dispose();
    }

    return TheAuthor;
}
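The GetSafeDbString and GetSafeDbBool calls above are small helper methods whose job is to cope with NULL values coming back from the database. They are not part of the generated code; a minimal version would be something like this:

private static string GetSafeDbString(SqlDataReader dr, string fieldName)
{
    int i = dr.GetOrdinal(fieldName);
    // Return an empty string rather than throwing if the column is NULL
    return dr.IsDBNull(i) ? "" : dr.GetString(i);
}

private static bool GetSafeDbBool(SqlDataReader dr, string fieldName)
{
    int i = dr.GetOrdinal(fieldName);
    // Treat NULL as false
    return !dr.IsDBNull(i) && dr.GetBoolean(i);
}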

Everything worked perfectly and I soon had a table showing the authors, using ASP.NET MVC.
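The controller code needed is minimal; something along these lines, where GetAuthors is a hypothetical list-returning companion to GetAuthor, generated in the same style:

using System.Collections.Generic;
using Microsoft.AspNetCore.Mvc;

public class AuthorsController : Controller
{
    // Pass the list of authors to the view, which renders the table
    public IActionResult Index()
    {
        List<ClsAuthor> authors = AuthorsData.GetAuthors(); // hypothetical helper class and method
        return View(authors);
    }
}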

In order to verify that it really does work with .NET Core I moved the project to Visual Studio for Mac and ran it there:

image

I may be unusual; but I am reassured that I have a relatively painless way to write a database application for .NET Core without using Entity Framework.

Microsoft updates the .NET stack with .NET Core 2.0 and updated Visual Studio. Should you use it?

Microsoft has released .NET Core 2.0, a major update to its open source, cross-platform version of the .NET runtime and C# language.

New features include an implementation of .NET Standard 2.0 (a way of targeting code to run on multiple .NET platforms) and new platform support, including Debian Stretch, macOS High Sierra and SUSE Linux Enterprise Server 12 SP2. There is preview support for both Linux and Windows on ARM32.

.NET Core 2.0 now supports Visual Basic as well as C# and F#. The version of C# has been bumped to 7.1, including async Main method support, inferred tuple names and default expressions.
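All three C# 7.1 features fit into a few lines. Here is a quick sketch, assuming the project’s language version is set to 7.1 or “latest”:

using System;
using System.Threading;
using System.Threading.Tasks;

class Program
{
    // C# 7.1: Main can now be async
    static async Task Main()
    {
        int count = 3;
        string unit = "projects";
        var summary = (count, unit);        // inferred tuple element names
        Console.WriteLine($"{summary.count} {summary.unit}");

        CancellationToken token = default;  // default literal: no need to repeat the type
        await Task.Delay(100, token);
    }
}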

Microsoft has also released Visual Studio 2017 version 15.3, which is required if you want to use .NET Core 2.0. New Visual Studio features include Azure Stack support, C# 7.1 support, .NET Framework 4.7 support, and other new features and fixes.

I updated Visual Studio and downloaded the new .NET Core 2.0 SDK and was soon up and running.

image

Note the statement that “This product collects usage data”, of which more below.

image

The sample ASP.NET MVC application worked first time.

image

How is .NET Core doing? The whole .NET picture is desperately confusing, and I get the impression that most .NET developers, while they may have paid some attention to what is happening, have concluded that the safe path is to continue with the Windows-only .NET Framework.

At the same time, .NET Core is strategically important to Microsoft. Cross-platform support means that C# has a life on the Mac and on Linux, which is vital to its health considering the popularity of the Mac amongst developers, and of Linux as a deployment platform for web applications. Visual Studio for Mac has also been updated and supports .NET Core 2.0 in the new version.

Another key piece is the container trend. .NET Core is ideal for container deployment, and is the only version of .NET supported in Windows Nano Server. If you want to embrace microservices running in containers, while still developing with C#, .NET Core and Nano Server is the optimum solution.

Why not use .NET Core, especially since it is faster than the .NET Framework? In these comparisons, .NET Core comes out as substantially faster than the .NET Framework for various algorithms – 600 times faster in one case.

The main issue is compatibility. .NET Core is a subset of the .NET Framework, and being a relative newcomer, it lacks the same level of third-party support.

Another factor is that there is no support for desktop applications, though some solutions have been devised. Microsoft does have a cross-platform GUI story, in Xamarin Forms, which is now in preview for macOS alongside iOS, Android, Windows and Tizen. If Xamarin used .NET Core that would be a great solution, but it does not (though it does support .NET Standard 2.0).

One of the pieces that most concerns developers is data access. If you use .NET Core you are strongly guided towards Entity Framework Core, a fork of Microsoft’s ORM (Object-Relational Mapping) framework. Someone asked on this page, is EF Core usable? Here’s an answer from one user (11 days ago):

Answering 4 months later but people should know: Definitely not, it is still not usable unless you are doing something very trivial and/or have very small DB.
I don’t understand how it is possible for MS to ship it, act like it’s OK and sparsely here and there provide shallow information about its limitations like in this article without warning clearly and explicitly about the serious issues this “v1 product” has.

Someone may jump in and say no, it is fine; but there are undoubtedly missing pieces and I would suggest caution.

You can also access data using the Connection/Command/DataReader approach which avoids EF, and although this is more work, this is what I would be inclined to do personally since you get the best performance and flexibility. Here is an example for SQL Server.

Who is using .NET Core? Controversially, Microsoft gathers telemetry from your use of the command-line tools, though you can opt out by setting an environment variable (DOTNET_CLI_TELEMETRY_OPTOUT=1). This means we have some data on .NET Core usage, though unfortunately it excludes Visual Studio usage. I downloaded the most recent dataset and imported it into a database. Here are the figures for OS family:

  • Total rows: 5,036,981
  • Windows: 3,841,922 (76.27%)
  • Linux: 887,750 (17.62%)
  • Mac: 307,309 (6.1%)


Given that this excludes Visual Studio users, who are also on Windows, we can conclude that the great majority of .NET Core developers use Windows, and only a tiny minority Mac (I do not know if Visual Studio for Mac usage is included). This is evidence that .NET Core has so far failed in its goal of persuading Mac-using developers to adopt .NET. It does show interest in deploying .NET applications to Linux, which is an obvious win in licensing costs as well as performance.

I would be interested in comments from developers on whether or not they use .NET Core and why.

An overreaching Office 365 integration from Sage

Sage, a software vendor best known for its accounting software, recently introduced an Office 365 integration in its product Sage 50c Accounts (the “c” is for cloud).

The integration offers several features including:

  • Automatic data backup to OneDrive
  • Contact integration so that you can easily see Sage accounts data for contacts in Office 365/Outlook
  • A mobile app that lets you capture receipts with your smartphone and import them
  • Excel reports
  • A Business Performance Dashboard


Very good; but how is this implemented? Users get a special Getting Started email which says:

Are you ready to integrate your Microsoft Office 365 account with Sage 50c Accounts? All you need to do is click Get Started and sign in using the administrator account for your Office 365 Business Premium subscription, and we will guide you through accepting terms and conditions, how to sync your data and setup the Sage apps and users

To sign in, you’ll enter your email and password for your administrator account. Your email is formatted as follows: xxx@xxx.onmicrosoft.com. If you have forgotten your Office 365 administrator password, please click here for more information.

You’ll be asked to accept a provider invitation to give us permission to activate the Sage add-ins for your Office 365 account. Easy.

If you know Office 365 you will spot something odd in the above. Sage is asking you not just to install an Office 365 application, but to “accept a provider invitation”.

It is as bad as it sounds. In order to get its integration working, Sage demands that you appoint it as a Cloud Solution Provider (CSP) for your entire Office 365 tenancy. This does not require that you start paying for your tenancy via Sage, as it can be alongside an existing CSP relationship. However it does give Sage complete access to the tenancy including the ability to reset the global administrator password.

While I do not think it is likely that Sage will do anything bad, this is a lot to ask. It means that in the unlikely event that Sage has its systems compromised, your Office 365 data is at risk.

It gets worse. Once you have agreed to hand over the keys to your Office 365 kingdom, you click a “Let’s get started” button in Sage 50c Accounts on your desktop. You have to log in as manager (a local Sage administrator) and then enter the credentials for your Office 365 global administrator. These credentials are then stored by Sage for 90 days and used to perform synchronization. After 90 days, it will demand that the credentials are entered again.

And by the way, you will need an Office 365 Business Premium license for the global administrator, even though that account is not normally used for day-to-day work and would not otherwise need one.

Why is this bad? First, it is a mis-use of the global administrator account. Best practice is that this account is used only for Office 365 administration. It should not be an active user account for email, OneDrive etc, since this increases the risk of the account being compromised.

Second, end users (such as those in the accounts department) do not normally have knowledge of the global administrator credentials. Therefore to perform this operation, they will need to contact their IT support every 90 days.

Third, the fact that Sage holds these credentials on a user’s PC, albeit I presume encrypted, adds a possible attack vector for your Office 365 tenancy. If the PC were hijacked or infected with malware, a bad guy could start trying to figure out whether there is a way of persuading Sage to do something bad.

Fourth, it is not even wise to enter these credentials on an end user PC. Perhaps I am being excessively cautious, but it is obvious that an end-user PC that is used for day to day work, web browsing and so forth, by someone non-specialist in IT terms, is more vulnerable than an administrator’s PC. If a keylogger were installed, then there is an opportunity to grab the global administrator credentials every 90 days.

Frankly, I do not recommend that businesses use this integration in its current implementation. Nor is it necessary. There are plenty of ways to create Office 365 applications that integrate nicely using the APIs which Microsoft has provided. Maybe there is a feature or two which is difficult to implement without these rights; in this case, the correct thing to do is to badger Microsoft to provide a new API, or perhaps to recognise that the security cost of adding the feature is not worth the value which it adds.
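For comparison, a third-party application can read Office 365 data such as contacts through the Microsoft Graph REST API, using a token the customer has consented to for specific scopes rather than tenant-wide rights. A bare-bones sketch (token acquisition via an Azure AD app registration is omitted; the endpoint shown is Graph’s standard contacts URL):

using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class GraphExample
{
    // Reads the signed-in user's contacts with a delegated Contacts.Read token.
    // Obtaining the access token (Azure AD app registration and consent) is not shown here.
    static async Task<string> GetContactsJson(string accessToken)
    {
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);
            return await client.GetStringAsync("https://graph.microsoft.com/v1.0/me/contacts");
        }
    }
}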

My suspicion is that Sage has gone down this path by a process of evolution. It set itself up as an Office 365 CSP (before doing this integration) in order to get some extra business, which is fair enough. Then it started adding value to its Office 365 tenants, making use of what it could do as the customer’s CSP. Then it wanted to extend that to other Office 365 customers, those for whom it was not the CSP, and went down the path of least resistance, “oh, let’s just require that we become their CSP as well.”

Imagine if other third-party vendors go down this route. Your specialist business software supplier, your CRM supplier, your marketing software, all demanding total access and control over your Office 365 setup.

It is overreaching and disappointing that Microsoft CEO Satya Nadella blessed this integration with a quote about “empowering professionals” when the truth is that this is the wrong way to go about it.

The downside of “Windows as a service”: disappearing features (and why I will miss Paint)

Microsoft has posted a list of features that are “removed or deprecated” in the next major update to Windows 10, called the Fall Creators Update.

The two that caught my eye are Paint, a simple graphics editor whose ancestry goes right back to Windows 1.0 in 1985, and System Image Backup, a means of backing up Windows that preserves applications, settings and documents.

I use Paint constantly. It is ideal for cropping screenshots and photos, where you want a quick result with no need for elaborate image processing. It starts in a blink, lets you resize images while preserving aspect ratio, and supports .BMP, .GIF, .JPG, .PNG and .TIF – all the most important formats.

I used Paint to crop the following screen, of the backup feature to be removed.

image

System Image Backup is the most complete backup Windows offers. It copies your system drive so that you can restore it to another hard drive, complete with applications and data. By contrast, the “modern” Windows 10 backup only backs up files and you will need to reinstall and reconfigure the operating system along with any applications if your hard drive fails and you want to get back where you were before. “We recommend that users use full-disk backup solutions from other vendors,” says Microsoft unhelpfully.

If System Image Backup does stop working, take a look at Disk2vhd which is not entirely dissimilar, but copies the drive to a virtual hard drive; or the third party DriveSnapshot which can backup and restore entire drives. Or of course one of many other backup systems.

The bigger picture here is that when Microsoft pitched the advantages of “Windows as a service”, it neglected to mention that features might be taken away as well as added.

Microsoft Edge browser crashing soon after launch: this time, it’s IBM Trusteer Rapport to blame

A common problem (I am not sure how common, but there are hundreds of reports) with the Edge browser in Windows 10 is that it gets into the habit of opening and then immediately closing, or closing when you try to browse the web.

I was trying to fix a PC with these symptoms. In the event log, an error was logged: “Faulting module name: EMODEL.dll”. Among much useless advice out there, there is one fix that has some chance of working. You can reinstall Edge by following a couple of steps, as described in various places. Something like this (though be warned: you will lose ALL your Edge settings, favourites and so on):

Delete C:\Users\%username%\AppData\Local\Packages\Microsoft.MicrosoftEdge_8wekyb3d8bbwe (a few files may get left behind)

Reboot

Run PowerShell, then:

Get-AppXPackage -Name Microsoft.MicrosoftEdge | Foreach {Add-AppxPackage -DisableDevelopmentMode -Register "$($_.InstallLocation)\AppXManifest.xml" -Verbose}

However this did not fix the problem – annoying, after losing the settings. I was about to give up when I found this thread. The culprit, for some at least, is IBM Trusteer Rapport and its Early Browser Protection feature. I disabled this, rebooted, and Edge now works.

Failing that, you can stop or uninstall Rapport, and that should also fix the problem.

Licensing Azure Stack: it’s complicated (and why Azure Stack is the iPad of servers)

Microsoft’s Azure Stack is a pre-configured, cut-down version of Microsoft’s mighty cloud platform, condensed into an appliance-like box that you can install on your own premises.

Azure Stack is not just a new way to buy a bunch of Windows servers. Both the technical and the business models are different from anything you have seen before from Microsoft. On the technical side, your interaction with Azure Stack is similar to your interaction with Azure. On the business side, you are buying the hardware, but renting the software. There is no way, according to the latest pricing and licensing guide, to purchase a perpetual license for the software, as you can for Windows Server. Instead, there are two broad options:

Pay-as-you-use

In this model, you buy software services on Azure Stack in exactly the same way as you do on Azure. The fact that you have bought your own hardware gets you a discount (probably). The guide says that “Azure Stack service fees are typically lower than Azure prices”.

The services are priced as follows:

  • Base virtual machine: $0.008/vCPU/hour ($6/vCPU/month)
  • Windows Server virtual machine: $0.046/vCPU/hour ($34/vCPU/month)
  • Azure Blob Storage: $0.006/GB/month (no transaction fee)
  • Azure Table and Queue: $0.018/GB/month (no transaction fee)
  • Azure App Service (Web Apps, Mobile Apps, API Apps, Functions): $0.056/vCPU/hour ($42/vCPU/month)

This has the merit of being easy to understand. It gets more complex if you take the additional option of using existing licenses with Azure Stack. “You may use licenses from any channel (EA, SPLA, Open, and others),” says the guide, “as long as you comply with all software licensing and product terms.” That qualification is key; those documents are not simple. Let’s briefly consider Windows Server 2016 Standard, for example. Licensing is per core. To install Windows Server 2016 Standard on a VM, you have to license all the cores in the physical server, even if your VM only has one virtual CPU. The servers in Azure Stack, I presume, have lots of cores. Even when you have done this, you are only allowed to install it on up to two VMs. If you need it on a third VM, you have to license all the cores again. Here are the relevant words:

Standard Edition provides rights for up to 2 Operating System Environments or Hyper-V containers when all physical cores in the server are licensed. For each additional 1 or 2 VMs, all the physical cores in the server must be licensed again.

Oh yes, and once you have done that, you need to purchase CALs as well, for every user or device accessing a server. Note too that on Azure Stack you always have to pay the “base virtual machine” cost in addition to any licenses you supply.
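To make that concrete with an illustrative example: on an Azure Stack node with 16 physical cores, one Windows Server 2016 Standard license covering all 16 cores entitles you to run two Windows Server VMs on that node; a third or fourth VM means licensing all 16 cores a second time, and so on, with CALs for every user or device on top, and the base virtual machine fee of $6/vCPU/month still payable for each VM.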

This is why the only sane way to license Windows Server 2016 in a virtualized environment is to use the expensive Datacenter edition. Microsoft’s pay-as-you-use pricing will be better for most users.

Capacity model

This is your other option. It is a fixed annual subscription with two variants:

  • App Service, base virtual machines and Azure Storage: $400 per core per year
  • Base virtual machines and Azure Storage only: $144 per core per year

The Capacity Model is only available via an Enterprise Agreement (500 or more users or devices required); and you still have to bring your own licenses for Windows Server, SQL Server and any other licensed software required. Microsoft says it expects the capacity model to be more expensive for most users.
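As a rough illustration (treating a vCPU as equivalent to a physical core, which will not be exact in practice): 16 vCPUs of Windows Server virtual machine on pay-as-you-use cost $34 x 16 x 12 = $6,528 per year with the Windows Server license included in the meter; the same 16 cores on the cheaper capacity variant cost $144 x 16 = $2,304 per year, but you must then add your own Windows Server licenses and CALs, and the $400 per core variant ($6,400 per year for 16 cores) is needed if you want App Service.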

SQL Server

There are two ways to use SQL Server on Azure. You can use a SQL database as a service, or you can deploy your own SQL Server in a VM.

The same is true on Azure Stack; but I am not clear how the licensing works if you offer SQL databases as a service. In the absence of any other guidance, it looks as if you will have to bring your own SQL Server license, which will make this expensive. However it would not surprise me if this ends up as an option in the pay-as-you-use model.

Using free software

It is worth noting that costs for both Azure and Azure Stack come way down if you use free software, such as Linux rather than Windows Server, and MySQL rather than SQL Server. Since Microsoft is making strenuous efforts to make its .NET application development framework cross-platform, that option is worth watching.

Support

You will have to get support for Azure Stack, since it is not meant to be user-serviceable. And you will need two support contracts, one with Microsoft, and one with your hardware provider. The hardware support is whatever you can negotiate with the hardware vendor. Microsoft support will be part of your Premier, Azure or Partner support in most cases.

Implications of Azure Stack

When Microsoft embarked on its Azure project, it made the decision not to use System Center, its suite of tools for managing servers and “private cloud”, but to create a new way to manage servers that is better automated, more scalable, and easier for end-users. Why would you use System Center if you can use Azure Stack? Well, one obvious reason is that with Azure Stack you are ceding a lot of control to Microsoft (and to your hardware supplier), as well as getting pushed down a subscription path for your software licensing. If you can handle that though, it does seem to me that running Azure Stack is going to be a lot easier and more productive than building your own private cloud, for most organizations.

This presumes of course that it works. The big risk with Azure Stack is that it breaks; and your IT administrators will not know how to fix it, because that responsibility has been outsourced to your hardware vendor and to Microsoft. It is possible, therefore, that an Azure Stack problem will be harder to solve than other typical Windows platform failures. A lot will depend on the quality control achieved both by Microsoft, for the software, and by its hardware partners.

Bottom line: this is the iPad of servers. You buy it but don’t really control it, and it is a delight to use provided it works.

Thoughts on Petya/NotPetya and two key questions. What should you do, and is it the fault of Microsoft Windows?

Every major IT security incident generates a ton of me-too articles most of which lack meaningful content. Journalists receive a torrent of emails from companies or consultants hoping to be quoted, with insightful remarks like “companies should be more prepared” or “you should always keep your systems and security software patched and up to date.”

An interesting feature of NotPetya (which is also Not Ransomware, but rather a malware attack designed to destroy data and disrupt business) is that keeping your systems and security software patched and up to date in some cases did not help you. Note this comment from a user:

Updated Win10 CU with all new cumulative updates and Win10 Insider Fast latest were attacked and affected. Probably used “admin” shares but anyway – Defender from Enterprise just ignored virus shared through network.

Nevertheless, running a fully updated Windows 10 did mitigate the attack compared to running earlier versions, especially Windows 7.

Two posts about NotPetya that are worth reading are the technical analyses from Microsoft here and here. Reading these, it is hard not to conclude that the attack was an example of state-sponsored cyberwarfare primarily targeting Ukraine. The main factor behind this conclusion is the lack of financial incentive: there was no serious effort to collect payment, which in any case could not restore files. Note the following from Microsoft’s analysis:

The VictimID shown to the user is randomly generated using CryptGenRandom() and does not correspond to the MFT encryption, so the ID shown is of no value and is also independent from the per-drive file encryption ID written on README.TXT.

My observations are as follows.

1. You cannot rely on security software, nor on OS patching (though this is still critically important). Another example of this came in the course of reviewing the new SENSE consumer security appliance from F-Secure. As part of the test, I plucked out a recent email which asked me to download a virus (thinly disguised as an invoice) and tried to download it. I succeeded. It sailed past both Windows Defender and F-Secure. When I tested the viral file with VirusTotal only 4 of 58 anti-virus applications detected it.

The problem is that competent new malware has a window of opportunity of at least several hours when it is likely not to be picked up. If during this time it can infect a significant number of systems and then spread by other means (as happened with both WannaCry and NotPetya) the result can be severe.

2. Check your backups. This is the most effective protection against malware. Further, backup is complicated. What happens if corrupted or encrypted files are backed up several times before the problem is spotted? This means you need a backup that can go back in time to several different dates. If your backup is always online, what happens if a network intruder is able to manage and delete your backups? This means you should have offline backups, or at least avoid having a single set of credentials which, if stolen, give an attacker full access to all your backups. What happens if you think you are backed up, but in fact critical files are not being backed up? This is common and means you must do a test restore from time to time, pretending that all your production systems have disappeared.

3. If you are running Windows, run Windows 10. I am sorry to have to say this, in that I recognize that in some respects Windows 7 has a more coherent design and user interface. But you cannot afford to miss out on the security work Microsoft has done in Windows 10, as the second Microsoft article referenced above spells out. 

4. Is it the fault of Microsoft Windows? An interesting discussion point which deserves more attention. The simplistic argument against Windows is that most malware attacks exploit bugs in Windows, therefore it is partly Microsoft’s fault for making the bugs, and partly your fault for running Windows. The more plausible argument is that Windows monoculture in business gives criminals an easy target, with a huge array of tools and expertise on how to hack it easily available.

The issue is in reality a complex one, and we should credit Microsoft at least with huge efforts to make Windows more secure. Users, it must be noted, are in many cases resistant to these efforts, perceiving them as an unnecessary nuisance (for example User Account Control in Vista); and historically third-party software vendors have also often got in the way, for example by being slow or reluctant to apply digital signatures to software drivers and applications.

Windows 8 was in part an effort to secure Windows by introducing a new and more secure model for applications. There are many reasons why this was unsuccessful, but there has been too little recognition of the security aspect of that effort.

The answer then is also nuanced. If you run Windows you can do so with reasonable security, especially if you are serious about it and use features such as Device Guard, which whitelists trusted applications. If you switch your business to Mac or Linux, you might well escape the next big attack, not so much because the OS is inherently more secure, but because you become part of a smaller and less attractive target.

For a better answer, see the next observation.

5. Most users should run a locked-down operating system. This seems rather obvious. Users who are not developers, who use the same half a dozen applications day to day, are better and more safely served by running a computer in which applications are properly isolated from the operating system and on which arbitrary executables from unknown sources are not allowed to execute. Examples are iOS, Android, Chrome OS and Windows 10 S.  Windows 10 Creators Update lets you move a little way in this direction by setting it to allow apps from the Store only:

image

There is a significant downside to running a locked-down operating system, especially as a consumer, in that you cede control of what you can and cannot install to the operating system vendor, as well as paying a fee to that vendor for each paid-for installation. Android and iOS users live with this because it has always been that way, but in Windows the change of culture is difficult. Another issue is limitations in the Windows Store app platform, though this is becoming less of an issue thanks to the Desktop Bridge, which means almost any application can become a Store application. In gaming there is a problem with Steam which is an entire third-party Store system (apparently Steam bypasses the Windows 10 control panel restriction, though it does not run on Windows 10 S). Open source applications are another problem, since few are available in the Windows Store, though this could change.

If we really want Windows to become more secure, we should get behind Windows 10 S and demand better third-party support for the Windows Store.

No more infrastructure roles for Windows Nano Server, and why I still like Server Core

Microsoft’s General Manager for Windows Server Erin Chapple posted last week about Nano Server (under a meaningless PR-speak headline) to explain that Nano Server, the most stripped-down edition of Windows Server, is being repositioned. When it was introduced, it was presented not only as a lightweight operating system for running within containers, but also for infrastructure roles such as hosting Hyper-V virtual machines, hosting containers, file server, web server and DNS Server (but without AD integration).

In future, Nano Server will be solely for the container role, enabling it to shrink in size (for the base image) by over 50%, according to Chapple. It will no longer be possible to install Nano Server as a standalone operating system on a server or VM. 

This change prompted Microsoft MVP and Hyper-V enthusiast Aidan Finn to declare Nano Server all but dead (which I suppose it is from a Hyper-V perspective) and to repeat his belief that GUI installs of Windows Server are best, even on a server used only for Hyper-V hosting. He writes:

Prepare for a return to an old message from Microsoft, “We recommend Server Core for physical infrastructure roles.” See my counter to Nano Server. PowerShell gurus will repeat their cry that the GUI prevents scripting. Would you like some baloney for your sandwich? I will continue to recommend a full GUI installation. Hopefully, the efforts by Microsoft to diminish the full installation will end with this rollback on Nano Server.

Finn’s main argument is that the full GUI makes troubleshooting easier. Server Core also introduces a certain amount of friction as most documentation relating to Windows Server (especially from third parties) presumes you have a GUI and you have to do some work to figure out how to do the same thing on Core.

Nevertheless I like Server Core and use it where possible. The performance overhead of the GUI is small, but running Core does significantly reduce the number of security patches and therefore required reboots. Note that you can run GUI applications on Server Core, if they are written to a subset of the Windows API, so vendors that have taken the trouble to fix their GUI setup applications can support it nicely.

Another advantage of Server Core, in the SMB world where IT policies can be harder to enforce, is that users are not tempted to install other stuff on their Server Core Domain Controllers or Hyper-V hosts. I guess this is also an advantage of VMware. Users log in once, see the command-line UI, and do not try installing file shares, print managers, accounting software, web browsers (I often see Google Chrome on servers because users cannot cope with IE Enhanced Security Configuration), remote access software and so on.

Only developers now need to pay attention to Nano Server, but that is no reason to give up on Server Core.