Microsoft’s Windows 10 October 2018 update on hold after some users find documents deleted: what to conclude?

Microsoft has paused the rollout of the Windows 10 October 2018 Update while it investigates reports of users losing data after the upgrade.

image

Update: Microsoft’s “known issues” list now asks affected users to “minimize your use of the affected device”, suggesting that file recovery tools will be needed to restore documents, with uncertain results.

Windows 10, first released in July 2015, marked the advent of “Windows as a service.” It was a profound change. The idea is that, whether in business or at home, Windows simply updates itself from time to time, so that you always have a secure and up-to-date operating system. Sometimes new features arrive. Occasionally features are removed.

Windows as a service was not just for the benefit of us, the users. It is vital to Microsoft in its push to keep Windows competitive with other operating systems, particularly as it faces competition from increasingly powerful mobile operating systems that were built for the modern environment. A two-year or three-year upgrade cycle, combined with the fact that many do not bother to upgrade, is too slow.

Note that automatic upgrade is not controversial on Android, iOS or Chrome OS. Some iOS users on older devices have complained of performance problems, but in general there are more complaints about devices not getting upgraded, for example because of Android operators or vendors not wanting the bother.

Windows as a service has been controversial though. Admins have worried about the extra work of testing applications. There is a Long Term Servicing Channel, which behaves more like the old 2-3 year upgrade cycle, but it is not intended for general use, even in business. It is meant for single-purpose PCs such as those controlling factory equipment, or embedded into cash machines.

Another issue has been the inconvenience of updates. “Restart now” is not something you want to see just before giving a presentation, or while finishing one at the last minute. Auto-restart occasionally loses work if you have not saved documents.

The biggest worry, though, is the update going wrong, for example by leaving a PC unusable. In general this is rare. Updates do fail, but Windows simply rolls back to the previous version: annoying, but not fatal.

What about deleting data? Again it is rare; but in this case recovery is not simple. You are in the realm of disk recovery tools, if you do not have a backup. However, it turns out that users have been reporting updates deleting data for some time. Here is one report from four months ago:

image

Why is the update deleting data? It is not yet clear, and there may be multiple reasons, but many of the reports I have seen refer to user documents stored outside the default location (C:\users\[USERNAME]\). Some users with problems have multiple folders called Documents. Some have moved the location the proper way (Location tab in properties of special folders like Documents, Downloads, Music, Pictures) and still had problems.
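
As background on how the redirection works: the locations of these special “known folders” are recorded per user in the registry, under HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders, where the value named “Personal” corresponds to Documents. Here is a minimal Python sketch – my own illustration, nothing to do with the upgrade code – that prints where the folders currently point on a Windows machine:

```python
# Print where Windows currently thinks the "known folders" live.
# Assumes Windows; "Personal" is the registry value name for Documents.
import winreg

KEY_PATH = r"Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders"

# Registry value name -> friendly folder name (a subset, for illustration)
FOLDERS = {
    "Personal": "Documents",
    "My Music": "Music",
    "My Pictures": "Pictures",
    "My Video": "Videos",
    "Desktop": "Desktop",
}

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
    for value_name, friendly in FOLDERS.items():
        try:
            raw_path, _type = winreg.QueryValueEx(key, value_name)
        except FileNotFoundError:
            print(f"{friendly}: (no entry)")
            continue
        # Entries are usually REG_EXPAND_SZ, e.g. %USERPROFILE%\Documents
        print(f"{friendly}: {winreg.ExpandEnvironmentStrings(raw_path)}")
```

If a folder has been redirected via the Location tab, the corresponding value points somewhere outside C:\users\[USERNAME]\, which is exactly the situation that crops up in many of the problem reports.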

Look through miglog.xml though (here is how to find it) and you will find lots of efforts to make sense of the user’s special folder layout. This is not my detailed diagnosis of the issue, just an observation having ploughed through long threads on Reddit and elsewhere; of course these threads are full of noise.

Here is an example of a user who suffered the problem and had an unusual setup: the location of his special folders had been moved (before the upgrade) to an external drive, but there was still important data in the old locations.

We await the official report with interest. But what can we conclude, other than to take backups (which we knew already)?

Two things. One is that Microsoft needs to do a better job of prioritising feedback from its Feedback Hub. Losing data is a critical issue. The Feedback Hub, like the forums, is full of noise; but it is possible to identify critical issues there.

This is related, of course, to the suspicion that Microsoft is now too reliant on unpaid enthusiast testers, at the expense of thorough internal testing. Both are needed and both, I am sure, exist. What, though, is the balance, and has internal testing been scaled back on the strength of these widespread public betas?

The second thing is about priorities. There is a constant frustration that vendors (and Microsoft is not alone) pay too much attention to cosmetics and new features, and not enough to quality and fixing long-standing bugs and annoyances.

What do most users do after Windows upgrades? They are grateful that Windows is up and running again, and go back to working in Word and Excel. They do not care about cosmetic changes or new features they are unlikely to use. They do care about reliability. Such users are not wrong. They deserve better than to find documents missing.

One final note. Microsoft released Windows 10 1809 on 2nd October. However, the initial rollout was said to be restricted to users who manually checked Windows Update or used the Update Assistant. Microsoft said that automatic rollout would not begin until 9th October. In my case though, on one PC, I got the update automatically (no manual check, no Insider Build setting) on 3rd October. I have seen similar reports from others. I got the update on an HP PC less than a year old, and my guess is that this is the reason:

With the October 2018 Update, we are expanding our use of machine learning and intelligently selecting devices that our data and feedback predict will have a smooth update experience.

In other words, my PC was automatically selected to give Microsoft data on upgrades expected to go smoothly. I am guessing, though. I am sure I did not trigger the update myself, since I was away all day on 2nd October and buried in work on the 3rd when the update arrived (I switched to a laptop while it updated). I did not lose data, even though I do have a redirected Documents folder. I did see one anomaly: my desktop background was changed from blue to black, and I had to change it back manually.

What should you do if you have this problem and do not have backups? Microsoft asks you to call support. As far as I can tell, the files really are deleted, so there will not be an easy route to recovery. The best chance is to use the PC as little as possible; do a low-level copy of the hard drive if you can. Shadow Copy Explorer may help. Another useful tool is Zero Assumption Recovery. What you recover depends on whether the files have since been overwritten or not.

Update: Microsoft has posted an explanation of why the data loss occurred. It’s complicated and all to do with folder redirection (with a dash of OneDrive sync). It affected some users who redirected “known folders” like Documents to another location. The April 2018 update created spurious empty folders for some of these users. The October 2018 update therefore sought to delete them, but in doing so also deleted non-empty folders. It still looks like a bad bug to me: these were legitimate folders for storing user data and should not have been removed if not empty.
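
To illustrate the distinction (a sketch of my own in Python, not Microsoft’s actual cleanup code): removing a folder only when it is empty is a deliberately safe operation that refuses to touch anything containing files, whereas a recursive delete removes whatever is inside. A cleanup pass intended only to tidy away spurious empty folders should behave like the former.

```python
# Illustration only: the "safe" versus "recursive" ways to remove a folder.
import os
import shutil

def remove_if_empty(path: str) -> bool:
    """Delete the folder only if it contains nothing; never touches user data."""
    try:
        os.rmdir(path)          # raises OSError if the folder is not empty
        return True
    except OSError:
        return False            # non-empty (or in use): leave it alone

def remove_recursively(path: str) -> None:
    """Delete the folder and everything inside it - destructive if files remain."""
    shutil.rmtree(path)

# A redirected Documents folder that still holds files survives a cleanup pass
# built on remove_if_empty, but not one built on remove_recursively.
```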

More encouraging is that Microsoft has made some changes to its feedback hub so that users can “provide an indication of impact and severity” when reporting issues. The hope is that Microsoft will find reports of severe bugs more easily and therefore take action.

Updated 8th Oct to remove references to OneDrive Sync and add support notes. Updated 10th Oct with reference to Microsoft’s explanatory post.

Review: Synology DS119J. Great system but single bay and underpowered hardware make it worth spending a bit more

Synology has released a new budget NAS, the DS119j, describing it as “an ideal first NAS for the home”.

It looks similar to the DS115j, which it probably replaces – currently both models are listed on Synology’s site. What is the difference? The operating system is now 64-bit, the CPU is now a dual-core ARMv8 (though still at 800 MHz), and read/write performance gets a slight bump from 100 MB/s to 108 MB/s, according to the documentation.

I doubt any of these details will matter to the intended users, except that the more powerful CPU will help performance – though it is still underpowered, if you want to take advantage of the many applications which this device supports.

image

What you get is the DiskStation itself, a fairly slim white box with connections for power, a Gigabit Ethernet port, and two USB 2.0 ports. It is disappointing to see the slow USB 2.0 standard used here. You will also find a power supply, an Ethernet cable, and a small bag of screws.

image

The USB ports are for attaching USB storage devices or printers. These can then be accessed over the network.

The DS119j costs around £100.

Initial setup

You can buy these units either empty, as mine was, or pre-populated with a hard drive. Presuming it is empty, you slide the cover off, fit the 3.5" hard drive, secure it with four screws, then replace the cover and secure that with two screws.

What disk should you buy? A NAS is intended to be always on and you should get a 3.5" disk that is designed for this. Two common choices are the WD (Western Digital) Red series, and Seagate IronWolf series. At the time of writing, both a 4TB WD Red and a 4TB IronWolf are about £100 from Amazon UK. The IronWolf Pro is faster and specified for a longer life (no promises though), at around £150.

What about SSD? This is the future of storage (though the man from Seagate at Synology’s press event said hard drives will continue for a decade or more). SSD is much faster, but on a home NAS that advantage is blunted by the network connection, and it is much more expensive for the same amount of storage. You would also need a SATA SSD and a 3.5" adapter. Probably not the right choice for this NAS.

Fitting the drive is not difficult, but neither is it as easy as it could be. It is not hard to design bays in which drives can be securely fitted without screws. Further, the design of the bay is such that you have to angle a screwdriver slightly to turn the screws. Finally, the screw holes in the case are made entirely of plastic and it would be easy to overtighten them and strip the thread, so be careful.

Once assembled, you connect the DiskStation to a wired network and power it on. In most home settings, you will attach it to a network port on your broadband router; in other cases you may have a separate network switch. You cannot connect it over wifi, and that would anyway be a mistake, since you want the higher performance and reliability of a cable connection.

Once the NAS is connected to your network (and therefore to the internet) and powered on, you need to find it on the network, which you can do in one of several ways, including:

– Download the DS Finder app for Android or iOS.

– Download Synology Assistant for Windows, Mac or Linux.

– Have a look at your DHCP manager (probably in your router’s management pages for home users) and find the IP address; a quick network scan, as sketched below, can also turn it up.
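
If you go the IP-address route, a simple port scan can also locate the box. Here is a minimal Python sketch – my own, not a Synology tool – that probes a home subnet for anything answering on TCP port 5000, the default port for the DSM web interface. The 192.168.1.0/24 subnet is an assumption; adjust it for your network:

```python
# Probe a home subnet for hosts answering on TCP 5000 (DSM's default web port).
# Adjust SUBNET to match your own network; 192.168.1.0/24 is just an example.
import ipaddress
import socket

SUBNET = "192.168.1.0/24"
DSM_PORT = 5000
TIMEOUT_SECONDS = 0.3

for host in ipaddress.ip_network(SUBNET).hosts():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(TIMEOUT_SECONDS)
        # connect_ex returns 0 when the connection succeeds
        if sock.connect_ex((str(host), DSM_PORT)) == 0:
            print(f"Possible DiskStation at http://{host}:{DSM_PORT}")
```

It checks addresses one at a time, so it can take a minute or two on a full /24; the point is only to show the idea.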

If you use DS Finder you can set up the Synology DiskStation from your phone. Otherwise, you can use a web browser (my preferred option). All you need to do to get started is to choose a username and password. You can also choose whether to link your DiskStation with a Synology account and create a QuickConnect ID for it. If you do this, you will be able to connect to your DiskStation over the Internet.

The DiskStation sets itself up in a default configuration. You will have network folders for music, photo, video, and another called home for other documents. Under home you will also find Drive, which behaves like a folder but has extra features for synchronization and file sharing. For full use of Drive, you need to install a Drive client from Synology.

image

If you attach a USB storage device to a port on the DS119j, it shows up automatically as usbshare1 on the network. This means that any USB drive becomes network storage, a handy feature, though only at USB 2.0 speed.
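
Assuming the DiskStation keeps its default hostname and you are on a Windows PC, the share can be addressed directly by its UNC path. This small Python sketch (my illustration, not Synology tooling, and the hostname is an assumption) simply lists whatever is on the attached drive over the network:

```python
# Assumes a Windows client and the default hostname "DiskStation";
# the attached USB drive appears as the SMB share \\DiskStation\usbshare1.
from pathlib import Path

usb_share = Path(r"\\DiskStation\usbshare1")

# List the contents of the USB drive over the network
for entry in usb_share.iterdir():
    print(entry.name)
```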

Synology DSM (DiskStation Manager)

Synology DSM is a version of Linux adapted by Synology. It is mature and robust, now at version 6.2. The reason a Synology NAS costs much more than, say, a 4TB WD Elements portable USB drive is that the Synology is actually a small server, focused on storage but capable of running many different types of application. DSM is the operating system. Like most Linux systems, you install applications via a package manager, and Synology maintains a long list of packages covering a diverse range of functions, from backup and media serving through to business-oriented uses such as Java application hosting, a web server, Docker containers, support ticket management, email, and many more.

DSM also features a beautiful windowed user interface all running in the browser.

image

Installation and upgrade of packages is smooth, and whether you consider it as a NAS or as a complete server system for small businesses, it is impressive and (compared to a traditional Windows or Linux server) easy to use.

The question in relation to the DS119j is whether DSM is overkill for such a small, low-power device.

Hyper Backup

Given that this NAS only has a single drive, it is particularly important to back up any data. Synology includes an application for this purpose, called Hyper Backup.

image

Hyper Backup is very flexible and lets you back up to many destinations, including Amazon S3, Microsoft Azure, Synology’s own C2 cloud service, or local storage. For example, you could attach a large USB drive to the USB port and back up to that. Scheduling is built in.

I had a quick look at the Synology C2 service. It did not go well. I use Edge, the default web browser on Windows 10, and trying to set up Hyper Backup with Synology C2 just got me this error message.

image

I told Edge to pretend to be Firefox, which worked fine. I was invited to start a free trial. Then you get to choose a plan:

image

Plans start at €9.99 + VAT per year for 100GB. Of course, if you fill your 4TB drive that will not be enough. On the other hand, not everything needs to be backed up: things like downloads you can fetch again, or videos ripped from disks, are less critical, or better backed up to local drives. Cloud backup is ideal for important documents, though, since it is off-site. I have not compared prices, but I suspect that something like Amazon S3 or Microsoft Azure would be better value than Synology C2, though integration is smoother with Synology’s own service. Synology has its own datacentre in Frankfurt, so it is not just reselling Amazon S3; this may also help with compliance.

An ideal first NAS?

The DS119j is not an ideal NAS for one simple reason: it has only a single bay so does not provide resilient storage. In other words, you should not have data that is stored only on this DiskStation, unless it is not important to you. You should ensure that it is backed up, maybe to another NAS or external drive, or maybe to cloud storage.

Still, if you are aware of the risks with a single drive NAS and take sensible precautions, you can live with it.

I like Synology DSM, which makes these small NAS devices great value as small servers. For home users, they are great for shared folders, media serving (I use Logitech Media Server with great success), and PC backup. For small businesses, they are a strong substitute for the role once occupied by Microsoft’s Small Business Server, as well as being cheaper and easier to use.

If you only want a networked file share, there are cheaper options from the likes of Buffalo, but Synology DSM is nicer to use.

If you want to make fuller use of DSM though, this model is not the best choice. I noticed the CPU often spiked just using the control panel and package manager.

image

I would suggest stretching to at least the DS218j, which is similar but has two bays, 512MB of RAM and a faster CPU. Better still, I like the x86-based Plus series – but a 2-bay DS218+ is over £300. A DS218j is half that price and perhaps the sweet spot for home users.

Finally, Synology could do better with documentation for the first-time user. Getting started is not too bad, but the fact is that DSM presents you with a myriad of options and applications, and a better orientation guide would be helpful.

Conclusion? OK, but get the DS218j if you can.

Linux applications and .NET Core on a Chromebook make this an increasingly interesting device

I have been writing about Google Chromebooks of late and as part of my research went out and bought one, an HP Chromebook 14 that cost me less than £200. It has an Intel Celeron N3350 processor and a generous (at this price) 32GB of storage; many of the cheaper models have only 16GB.

This is a low-end notebook for sure, but still boots quickly and works fine for general web browsing and productivity applications. Chrome OS (the proprietary version of the open source Chromium OS) is no longer an OS that essentially just runs Google’s Chrome browser, though that is still the main intent. It has for some time been able to run Android applications; these run in a container which itself runs Android. Android apps run fairly well though I have experienced some anomalies.

Recently Google has added support for Linux applications, though this is still in beta. The main motivation for this seems to be to run Android Studio, so that Googlers and others with smart Pixelbooks (high-end Chromebooks that cost between £999 and £1,699) can do a bit more with their expensive hardware.

I had not realised that even a lowly HP Chromebook 14 is now supported by the beta, but when I saw the option in settings I jumped at it.

image

It took a little while to download, but then I was able to open a Linux terminal. Like Android, Linux runs in a container. It is also worth noting that Chrome OS itself is based on Linux, so in one sense Chromebooks have always run Linux; however, they have been locked down so that you could not, until now, install applications other than web apps or Android apps.

Linux is therefore sandboxed. It is configured so that you do not have access to the general file system. However the Chromebook Files application has access to your user files in both Chrome OS and Linux.

image

I found little documentation for running Linux applications so here are a few notes on my initial stumblings.

First, note that the Chromebook trackpad has no right-click. To right-click you do Alt-Click. Useful, because this is how you paste from the clipboard into the Linux terminal.

Similarly, there is no Delete key. To Delete you do Alt-Backspace.

I attribute these annoyances to the fact that Chrome OS was mostly developed by Mac users.

Second, no Linux desktop is installed. I did in fact install the lightweight LXDE desktop, with partial success, but it does not work properly.

The idea is that you install GUI applications which run in their own window. It is integrated so that once installed, Linux applications appear in the Chromebook application menu.

I installed Firefox ESR (Extended Support Release). Then I installed an application which promises to be particularly useful for me, Visual Studio Code. Next I installed the .NET Core SDK, following the instructions for Debian.

image

Everything worked, and after installing the C# extension for VS Code I am able to debug and run .NET Core applications.

I understand that you will not be so lucky with VS Code if you have an ARM Chromebook. Intel x86 is the winner for compatibility.

What is significant to me is not only that you can now run desktop applications on a Chromebook, but also that you can work on a Chromebook without needing to be deeply hooked into the Google ecosystem. You still need a Google account of course, for log in and the Play Store.

You will also note from the screenshot above that Chrome OS is no longer just about a full-screen web browser. Multiple overlapping windows, just like Windows and Mac.

These changes might persuade me to spend a little more on a Chromebook next time around. Certainly the long battery life is attractive. Following a tip, I disabled Bluetooth, and my Chromebook battery app is reporting 48% remaining, 9 hrs 23 minutes – which would imply well over 19 hours on a full charge. A little optimistic I suspect, but still fantastic.

Postscript: I was always a fan of the widely disliked Windows RT, which combined a locked-down operating system with the ability to run Windows applications. Maybe container technology is the answer to the conundrum of how to provide a fully capable operating system that is also protected from malware. Having said that, there is no doubt that these changes make Chromebooks more vulnerable to malware; even if malware only runs in the Linux environment, it could do damage and steal data. The OS itself, though, will be protected.

Microsoft Azure Stack: a matter of compliance

At the Ignite conference last week in Orlando, Microsoft’s hardware partners were showing off their latest Azure Stack boxes.

In conversation, one mentioned to me that Azure Stack was selling better in Europe than in the USA. Why? Because stricter compliance regulations (perhaps alongside the fact that the major cloud platforms are all based in North America) make Azure Stack more attractive in Europe.

image
Lenovo’s Azure Stack

Azure Stack is not just “Azure for your datacentre”. It is a distinctive way to purchase IT infrastructure, where you buy the hardware but pay for the software with a usage-based model.

Azure and Azure Stack VMs are resilient, so you cannot compare the value directly with simply running up a VM on your own server. Azure Stack is a premium option, but the benefits are real: Microsoft mostly looks after the software, you can use the excellent Azure management tools, and you get deep integration with Azure in the cloud. Further, you can reduce the cost by scaling back at times of low demand, which is especially easy if you use abstracted services such as App Service rather than raw VMs.

How big is the premium? I would be interested to hear from anyone who has done a detailed comparison, but my guess is that running your own servers with Windows Server Datacenter licenses (allowing unlimited VMs once all the cores are licensed) is substantially less expensive.

You can see therefore that there is a good fit for organizations that want to be all-in on the cloud, but need to run some servers on-premises for compliance reasons.