All posts by Tim Anderson

A glimpse into Microsoft history which goes some way to explaining the decline of Windows

Why is Windows in decline today? Short answer: because Microsoft lost out and/or gave up on Windows Phone / Mobile.

But how did it get to that point? A significant part of the story is the failure of Longhorn (when two to three years of Windows development was wasted in a big reset), and the failure of Windows 8.

In fact these two things are related. Here’s a post from Justin Chase; it is from back in May but only caught my attention when Jose Fajardo put it on Twitter. Chase was a software engineer at Microsoft between 2008 and 2014.

Chase notes that Internet Explorer (IE) stagnated because many of the developers working on it switched over to work on Windows Presentation Foundation, one of the “three pillars” of Longhorn. I can corroborate this to the extent that I recall a conversation with a senior Microsoft executive at Tech Ed Europe, in pre-Longhorn days, when I asked why not much was happening with IE. He said that the future lay in rich internet-connected applications rather than browser applications. Insightful perhaps, if you look at mobile apps today, but no doubt Microsoft also had in mind locking people into Windows.

WPF, based on .NET and DirectX, was intended to be used for the entire Windows shell in Longhorn. It was too slow, memory hungry, and buggy, eventually leading to the Longhorn reset.

“Ever since Longhorn the Windows team has had an extremely bitter attitude towards .NET. I don’t think its completely fair as they essentially went all in on a brand new technology and .NET has done a lot of evolving since then but nonetheless that sentiment remains among some of the now top players in Microsoft. So effectively there is a sentiment that some of the largest disasters in Microsoft history (IE’s fall from grace and multiple “bad” versions of Windows) are, essentially, totally the fault of gambling on .NET and losing (from their perspective).”

writes Chase.

This went on to impact Windows 8. You will recall that Windows Phone development was once based on Silverlight. Windows 8 however did not use Silverlight but instead had its own flavour of XAML. At the time I was bemused that Microsoft, with an empty Windows 8 app store, had not enabled compatibility with Windows Phone applications which would have given Windows 8 a considerable boost as well as helping developers port their code. Chase explains:

“So when Microsoft went to make their new metro apps for windows 8/10, they almost didn’t even support XAML apps but only C++ and JavaScript. It was only the passion of the developer community that pushed it over the edge and let it in.”

That was a shame because Silverlight was a great bit of technology: lightweight, powerful, graphically rich, and even cross-platform to some extent. If Microsoft had given developers a consistent and largely compatible path from Silverlight to Windows Phone to Windows 8 to Windows 10, rather than the endless changes of direction it delivered instead, its modern Windows development platform would be stronger. Perhaps, even, Windows Phone / Mobile would not have been abandoned; and we would not have to choose today between the Apple island and the ad-driven Android.

The end of the Edge browser engine. Another pivotal moment in Microsoft’s history

Microsoft’s Joe Belfiore has announced that future versions of its Edge web browser will be built on Chromium. Chromium is an open source browser project originated by Google, which uses it for Chrome. The browser engine is Blink, which was forked from WebKit in April 2013.

image

Belfiore does not specify what will happen to Chakra, the JavaScript engine used by Edge, but it seems likely that future versions of Edge will use the Chrome V8 engine instead.

There is plenty of logic behind the move. The immediate benefit to Microsoft in having its own browser engine is rather small. Chromium-based Edge will still have Microsoft’s branding and can still have unique features. It opens an easy route to cross-platform Edge, not only for Android, but also for MacOS and potentially Linux. It will improve web compatibility because all web developers know their stuff has to run properly in Chrome.

This is still a remarkable moment. The technology behind Edge goes right back to Trident, the Internet Explorer engine introduced in 1997. In the Nineties, winning the browser wars was seen as crucial to the future of the company, as Microsoft feared that users working mostly in the browser would no longer be hooked to Windows.

Today those fears have somewhat come to pass; and Windows does indeed face a threat, especially from Chrome OS for laptops, and of course from iOS and Android on mobile, though it turns out that internet-connected apps are just as important. Since Microsoft is not doing too well with its app store either, there are challenges ahead for Microsoft’s desktop operating system.

The difference is that today Microsoft cares more about its cloud platform. Replacing a Windows-only building block with a cross-platform one is therefore strategically more valuable than the opportunity to make Edge a key attraction of Windows, which was in any case unsuccessful.

The downside though (and it is a big one) is that the disappearance of the Edge engine means there is only Mozilla’s Gecko (used by Firefox), and WebKit, used by Apple’s Safari browser, remaining as mainstream alternatives to Chromium. Browser monoculture is drawing closer then, though the use of open source lessens the risk that any one company (it would be Google in this instance) will be able to take advantage.

Internet Explorer was an unhealthy monoculture during its years of domination, oddly not because of all its hooks to Windows, but because Microsoft let its development stagnate in order to promote its Windows-based application platform (at least, that is my interpretation of what happened).

Let me add that this is a sad moment for the Edge team. I like Edge and there was lots of good work done to make it an excellent web browser.

State of Microsoft .NET: transition to .NET Core or be left behind

The transition of Microsoft’s .NET platform from Windows-only to cross-platform (and open source) is the right thing. Along with Xamarin (.NET for mobile platforms), it means that developers with skills in C#, F# and Visual Basic can target new platforms, and that existing applications can with sufficient effort be migrated to Linux on the server or to mobile clients.

That does not mean it is easy. Microsoft forked .NET to create .NET Core (it is only four years since I wrote up one of the early announcements on The Register) and the problem with forks is that you get divergence, making it harder to jump from one fork to the other.

At first this was disguised. The idea was that .NET Framework (the old Windows-only .NET) would be evolved alongside .NET Core and new language features would apply to both, at least initially. In addition, ASP.NET Core (the web framework) runs on either .NET Framework or .NET Core.

This is now changing. Microsoft has shifted its position: .NET Framework is now in near-maintenance mode, and new features come only to .NET Core. Last month, Microsoft’s Damian Edwards stated that ASP.NET Core will only run on .NET Core from 3.0, the next major version.

This week Mads Torgersen, C# Program Manager, summarised new features in the forthcoming C# 8.0. Many of these features will only work on .NET Core:

Async streams, indexers and ranges all rely on new framework types that will be part of .NET Standard 2.1. As Immo describes in his post Announcing .NET Standard 2.1, .NET Core 3.0 as well as Xamarin, Unity and Mono will all implement .NET Standard 2.1, but .NET Framework 4.8 will not. This means that the types required to use these features won’t be available when you target C# 8.0 to .NET Framework 4.8.

Default interface member implementations rely on new runtime enhancements, and we will not make those in the .NET Runtime 4.8 either. So this feature simply will not work on .NET Framework 4.8 and on older versions of .NET.
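
To make the dependency concrete, here is a minimal sketch of two of those features, async streams and ranges, which compile against .NET Core 3.0 because they rely on .NET Standard 2.1 types (IAsyncEnumerable&lt;T&gt;, System.Index, System.Range) that .NET Framework 4.8 lacks. The code is illustrative only, not taken from Microsoft’s posts.

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

class Program
{
    // Async stream: returns IAsyncEnumerable<int>, a .NET Standard 2.1 type,
    // so this compiles for .NET Core 3.0 but not for .NET Framework 4.8.
    static async IAsyncEnumerable<int> GetNumbersAsync()
    {
        for (int i = 0; i < 5; i++)
        {
            await Task.Delay(100); // simulate asynchronous work
            yield return i;
        }
    }

    static async Task Main()
    {
        // Consume the async stream with await foreach (C# 8.0).
        await foreach (int n in GetNumbersAsync())
        {
            Console.WriteLine(n);
        }

        // Indices and ranges (C# 8.0) rely on System.Index and System.Range.
        int[] values = { 10, 20, 30, 40, 50 };
        Console.WriteLine(values[^1]);                       // last element: 50
        Console.WriteLine(string.Join(",", values[1..4]));   // 20,30,40
    }
}
```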

The obvious answer is to switch to .NET Core. Microsoft is making this more feasible by supporting WPF and Windows Forms with .NET Core, on Windows only. Entity Framework 6 will also be supported, and it is likely that this will work on Windows 7 as well as Windows 10.
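
As a rough illustration (not taken from Microsoft’s documentation), desktop code like the following Windows Forms snippet should compile unchanged whether the project targets .NET Framework or, from 3.0, .NET Core; on the .NET Core side the difference lives in the project file (a Microsoft.NET.Sdk.WindowsDesktop project with UseWindowsForms enabled) rather than in the C# source.

```csharp
using System;
using System.Windows.Forms;

static class Program
{
    [STAThread]
    static void Main()
    {
        // Standard Windows Forms bootstrapping: the same source builds
        // against .NET Framework today and against .NET Core 3.0 when the
        // project targets the Windows desktop SDK (Windows only).
        Application.EnableVisualStyles();
        Application.SetCompatibleTextRenderingDefault(false);

        var form = new Form { Text = "Hello from Windows Forms" };
        form.Controls.Add(new Label { Text = "Same C# source, either runtime", AutoSize = true });

        Application.Run(form);
    }
}
```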

This move will not be welcome to all developers. Servicing for .NET Framework is automatic, via Windows Update or on-premises equivalents, whereas servicing for .NET Core requires developer attention. Inevitably some things will not work quite the same on .NET Core, and for long-term stability it may be preferable to stay with .NET Framework. The more rapid release cycle of .NET Core is not necessarily a good thing if you prioritise reliability over new features.

The problem though: from now on, .NET Framework will not evolve much. There are a few new things in .NET Framework 4.8, like high DPI support, an Edge-based browser control, and better touch support, but these are minimal, essential updates. In time, maintaining applications on .NET Framework will look like a mistake as application capabilities and performance fall behind. That means that if you are a .NET developer, .NET Core is in your future.

From Big Blue to Big Red? IBM to acquire Red Hat

image

IBM has agreed to acquire Red Hat:

IBM will acquire all of the issued and outstanding common shares of Red Hat for $190.00 per share in cash, representing a total enterprise value of approximately $34 billion.

IBM is presenting this as a hybrid cloud play, with the claim that businesses are held back from cloud migration “by the proprietary nature of today’s cloud market.”

IBM and Red Hat will be strongly positioned to address this issue and accelerate hybrid multi-cloud adoption. Together, they will help clients create cloud-native business applications faster, drive greater portability and security of data and applications across multiple public and private clouds, all with consistent cloud management. In doing so, they will draw on their shared leadership in key technologies, such as Linux, containers, Kubernetes, multi-cloud management, and cloud management and automation.

Notably, the announcement specifically refers to multi-cloud adoption, and says that the company intends to “build and enhance” partnerships with Amazon Web Services (AWS), Microsoft Azure, Google Cloud and Alibaba.

Red Hat will be a “distinct unit” within IBM, the intention being to preserve its open source culture and independence.

My own instinct is that we will see more IBM influence on Red Hat than Microsoft influence on GitHub, to take another recent example of an established tech giant acquiring a company with an open source culture.

IBM is coming from behind in the cloud wars, but with Linux ascendant, and Red Hat the leader in enterprise Linux, the acquisition gives the company a stronger position in today’s technology landscape.

Microsoft is making lots of money. Anything else notable in its first quarter financials?

Microsoft has released its financial statements for the first quarter of its financial year, which ended 30th September. Here is the segment breakdown. Everything has moved in the right direction.

Quarter ending September 30th 2018 vs quarter ending September 30th 2017, $millions

Segment | Revenue | Change | Operating income | Change
Productivity and Business Processes | 9,771 | +1,533 | 3,881 | +875
Intelligent Cloud | 8,567 | +1,645 | 2,931 | +794
More Personal Computing | 10,746 | +1,368 | 3,143 | +578

The segments break down as:

Productivity and Business Processes: Office, Office 365, Dynamics 365 and on-premises Dynamics, LinkedIn

Intelligent Cloud: Server products, Azure cloud services

More Personal Computing: Consumer including Windows, Xbox; Bing search; Surface hardware

Any points of interest? In his earnings call statement, CEO Satya Nadella talked up Teams, the Office 365 conferencing and collaboration solution:

“Teams is now the hub for teamwork for 329,000 organizations, including 87 of the Fortune 100. And, we are adding automated translation support for meetings, shift scheduling for firstline workers, and new industry-specific offerings including healthcare and small business.”

He also mentioned Power Apps and Flow, interesting to me because they are the most successful so far of the company’s efforts to come up with a low-code development platform:

“Power BI, Power Apps and Flow are driving momentum with customers and have made us a leader in no-code app building and business analytics in the cloud.”

He also mentioned the pending GitHub acquisition, which he says is “an opportunity to bring our tools and services to new audiences while enabling GitHub to grow and retain its developer-first ethos.”

Note that despite the cloud growth, More Personal Computing, the segment which includes Windows, remains the biggest single segment in terms of revenue.

Determining how much of Microsoft’s business is “cloud” is tricky. The figures in the productivity segment lump together Office 365 and on-premises products, while Office 365 itself is in part a subscription to desktop Office, so not pure cloud. Equally, the “intelligent cloud” segment includes on-premises server licenses. No doubt this fuzzing of what is and is not cloud in the figures is deliberate.

Windows on a Chromebook? How containers change everything

Apparently there are rumours concerning Windows on a Chromebook. I find this completely plausible, though unlike Barry Collins I would not recommend dual boot – always a horrible solution.

Rather, when I recently explored Chromebooks and Chrome OS, it was like the proverbial lightbulb illuminating in my head. Containers (used to implement Linux and Android on Chrome OS) change everything. The approach makes total sense: a secure, locked-down base operating system, and arbitrary applications running in isolated containers on top.

Could Chrome OS run Windows in a container? Not directly, since containers are isolated from one another but share the host operating system’s kernel and resources. However you could run Windows in a VM on a Chromebook, and with a bit of integration work this could be relatively seamless for the user. Systems like Parallels do this trick on MacOS. Instead of the wretched inconvenience of dual boot, you could run a Linux app here, and a Windows app there, and everything would integrate nicely together.

Microsoft could also re-engineer Windows along these lines. A lot of the work is already done. Windows supports containers and you can choose the level of isolation, with either lightweight containers or containers based on Hyper-V. It also supports Linux containers, via Hyper-V. Currently this is designed not for client applications but for non-visual server applications, though this could change. It is also possible to run Linux containers on the Windows Subsystem for Linux, though this is not currently supported.

Windows RT failed for a few reasons: ARM-only, underpowered hardware, Windows 8 unpopularity, and most of all, inability to run arbitrary x86 Windows applications.

A container-based Windows could have the security and resilience of Windows RT, but without these limitations.

So I can imagine Google giving us the ability to run virtual Windows on Chrome OS. And I can imagine Microsoft building a future version of Windows in which you can run both Windows and Linux applications in isolated environments.

Linux Foundation Open Source Summit opens in Edinburgh: Microsoft praised for “facing reality”

The Linux Foundation Open Source Summit kicked off in Edinburgh today, with Executive Director Jim Zemlin declaring that the organization is now adding a new member daily. The Linux Foundation oversees over 150 open source projects, including Linux, Kubernetes, Let’s Encrypt, Cloud Foundry and Cloud Native Computing Foundation, and has over 1320 members.

image

Microsoft has been a member for several years, but has now also signed up to the Open Invention Network (OIN), promising patent non-aggression to other licensees. It is a significant move which has boosted both the OIN and the Linux Foundation.

image
Linux Foundation Executive Director Jim Zemlin

Keith Bergelt, CEO of OIN, took the stage to congratulate Microsoft on “facing the reality of the world as it is.”

Another important recent event is the statement by Linus Torvalds, in which he apologises for brusque behaviour and says he is taking some time off Linux kernel development:

“I need to change some of my behavior, and I want to apologize to the people that my personal behavior hurt and possibly drove away from kernel development entirely. I am going to take time off and get some assistance on how to understand people’s emotions and respond appropriately.”

What are the implications for Linux? Nobody knows; though LWN’s Jonathan Corbet spoke at this morning’s keynote to assure us that a new code of conduct, in which kernel developers promise to be nicer to each other, will be a good thing.

I interviewed Zemlin today and will post more from the event soon.

Bob Dylan’s Mondo Scripto work and exhibition in London

If you are in London before 30 November 2018, and any kind of Bob Dylan fan, then I highly recommend Mondo Scripto, an exhibition of his drawings and Iron Works gates.

image

Let’s start with the Iron Works. Way back in 2001 Dylan began constructing gates from scrap metal, as gifts for friends or for his own property. “Gates appeal to me because of the negative space they allow. They can be closed but at the same time they allow the seasons and breezes to flow. They can shut you out or shut you in. And in some ways there is no difference,” he says.

Iron is part of Dylan’s history because he was born and brought up in Hibbing, Minnesota, iron country and near a large open pit mine. So for Dylan it is a return to roots as well as another way to exercise his creativity. His Iron Works were exhibited at the Halcyon Gallery in London in 2013, and you can see them now peppered around the display of drawings and paintings in the current exhibition.

The gates look strong but quirky, symmetrical in some ways but not in others, painted in subtle shades that bring out the metalness and variety. There are gear wheels, chains, spanners at odd angles, animal shapes, pliers, roller skates, wheels and springs all jumbled together but making a cohesive whole. Somewhat like the way he uses language in his songs.

image 

But what of Mondo Scripto? Dylan has taken 64 songs, written out the lyrics in blocks (so they are not particularly easy to read), and illustrated each one with a drawing. For example, here is one that I like, Just Like a Woman:

image

Maybe you would expect to see a woman in the drawing; but no, this is the line “as I stand inside the rain.”

Note that the exhibits at the Halcyon Gallery are the actual drawings, not just the signed prints you can buy for £1895 each (10 songs, limited editions of 495 for each song).

I am not much interested in the collector’s aspect here. I think that is a lot of money for a print and a signature. The originals also look much sharper and better than the prints, which is disappointing if you have the print. You may be able to negotiate to buy an original but it will cost a lot more. I would quite like one of my favourites on the wall, or a Dylan gate in the garden, but the cost is too steep for me; if I were wealthy enough for it to be spare change, perhaps I would. Then again, the profits are enabling this amazing exhibition, which is free to attend, so it is not so bad.

What I am interested in is the choice of songs, the choice of images and the way they are executed, and little details like small changes in the lyrics. For example, Ballad of a Thin Man in the original:

There ought to be a law against you comin’ around
You should be made to wear earphones

and in Mondo Scripto:

There ought to be a law against you coming around
Next time don’t forget to first telephone

I was intrigued to see that in Subterranean Homesick Blues, Dylan has written out:

don’t try ‘No Doze’

which is the latest chapter in a story; the original is “don’t tie no bows” but via humour and mis-transcription has become what it is. Maybe Dylan just copied it, maybe he likes it better now, who knows?

This is Dylan so there is variation everywhere. You can get a nice book/catalogue of Mondo Scripto for £45 (this one is a good buy), and this includes 60 songs written out with their drawings. However, many of the drawings are different to those in the exhibition. Apparently the book was done first, so those are the earlier versions. Just Like Tom Thumb’s Blues, for example, has a bottle of wine in the book (“I started out on Burgundy”) and a cityscape in the exhibition (“Back to New York City”?).

Knockin’ on Heaven’s Door has special treatment. This has been done as a series of 16 drawings, pen drawings rather than pencil sketches, including a variety of different doors and techniques for knockin’ on them. It begins with a hand knock, and ends with a rap from a cross. There is also a drill, a crowbar, a bottle, and so on.

Lovely humour, but also a meditation on death? Possibly, though Dylan has been singing about death at least since Fixin’ to Die on his very first album, so we should not take this as any kind of final statement.

It is nevertheless true that Mondo Scripto is Dylan’s reflection on his best-known songs, made new by drawings which bring out a striking image or thought and which remind you how extraordinary they are.

One of the features of this beautifully laid-out exhibition is a wall of books, some by but mostly about Dylan.

image

It is a reminder of how many of us have been entertained, absorbed and challenged by this body of work.

Mondo Scripto and the Iron Works are remarkable, coming from a man who also tours incessantly and is at an age when many of us sit around doing nothing much at all.

image

The Mondo Scripto song list

1. Song To Woody
2. Blowin’ In The Wind
3. Girl From The North Country
4. Don’t Think Twice, It’s All Right
5. Masters Of War
6. One Too Many Mornings
7. A Hard Rain’s A-Gonna Fall
8. Oxford Town
9. Tombstone Blues
10. Desolation Row
11. It’s All Over Now, Baby Blue
12. It’s Alright, Ma (I’m Only Bleeding)
13. Like A Rolling Stone
14. Mr Tambourine Man
15. It Ain’t Me, Babe
16. Ballad Of A Thin Man
17. Just Like Tom Thumb’s Blues
18. The Times They Are A-Changin’
19. Ballad Of A Thin Man*
20. All I Really Want To Do
21. Rainy Day Women #12 & 35
22. I Want You
23. Highway 61 Revisited
24. Leopard-Skin Pill-Box Hat
25. Stuck Inside Of Mobile With The Memphis Blues Again
26. Visions Of Johanna
27. One Of Us Must Know (Sooner Or Later)
28. Subterranean Homesick Blues
29. She Belongs To Me
30. Maggie’s Farm
31. Love Minus Zero/No Limit
32. Just Like A Woman
33. Chimes Of Freedom
34. Positively 4th Street
35. Tangled Up In Blue
36. “Knockin’ On Heaven’s Door” series
37. All Along The Watchtower
38. Lay, Lady, Lay
39. The Man In Me
40. Tomorrow Is A Long Time
41. When I Paint My Masterpiece
42. I Shall Be Released
43. Forever Young
44. Knockin’ On Heaven’s Door
45. If You See Her, Say Hello
46. One More Cup Of Coffee (Valley Below)
47. “Subterranean Homesick Blues” series
48. Shelter From The Storm
49. Simple Twist of Fate
50. You’re Gonna Make Me Lonesome When You Go
51. Gotta Serve Somebody
52. Isis
53. Jokerman
54. Every Grain Of Sand
55. Hurricane
56. This Wheel’s On Fire
57. Man In The Long Black Coat
58. Isis
59. Things Have Changed
60. Workingman’s Blues #2
61. Mississippi
62. Ain’t Talkin’
63. Highlands
64. Make You Feel My Love


Microsoft’s Windows 10 October 2018 update on hold after some users suffer deleted documents: what to conclude?

Microsoft has paused the rollout of the October 2018 Windows update for Windows 10 while it investigates reports of users losing data after the upgrade.

image

Update: Microsoft’s “known issues” page now asks affected users to “minimize your use of the affected device”, suggesting that file recovery tools are needed for restoring documents, with uncertain results.

Windows 10, first released in July 2015, was the advent of “Windows as a service.” It was a profound change. The idea is that whether in business or at home, Windows simply updates itself from time to time, so that you always have a secure and up to date operating system. Sometimes new features arrive. Occasionally features are removed.

Windows as a service was not just for the benefit of us, the users. It is vital to Microsoft in its push to keep Windows competitive with other operating systems, particularly the increasingly powerful mobile operating systems that were built for the modern environment. A two-year or three-year upgrade cycle, combined with the fact that many do not bother to upgrade, is too slow.

Note that automatic upgrade is not controversial on Android, iOS or Chrome OS. Some iOS users on older devices have complained of performance problems, but in general there are more complaints about devices not getting upgraded, for example because of Android operators or vendors not wanting the bother.

Windows as a service has been controversial though. Admins have worried about the extra work of testing applications. There is a Long Term Servicing Channel, which behaves more like the old 2-3 year upgrade cycle, but it is not intended for general use, even in business. It is meant for single-purpose PCs such as those controlling factory equipment, or embedded into cash machines.

Another issue has been the inconvenience of updates. “Restart now” is not something you want to see just before giving a presentation, or working on it at the last minute, for example. Auto-restart occasionally loses work if you have not saved documents.

The biggest worry though is the update going wrong, for example causing a PC to become unusable. In general this is rare. Updates do fail, but Windows simply rolls back to the previous version, which is annoying but not fatal.

What about deleting data? Again it is rare; but in this case recovery is not simple. You are in the realm of disk recovery tools, if you do not have a backup. However it turns out that users have reported updates deleting data for some time. Here is one from 4 months ago:

image

Why is the update deleting data? It is not yet clear, and there may be multiple reasons, but many of the reports I have seen refer to user documents stored outside the default location (C:\users\[USERNAME]\). Some users with problems have multiple folders called Documents. Some have moved the location the proper way (Location tab in properties of special folders like Documents, Downloads, Music, Pictures) and still had problems.
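
For context on why folder redirection matters here: well-behaved Windows applications do not hard-code C:\users\[USERNAME]\Documents, they ask the system for the current location of the known folder, which reflects any move made via the Location tab. A minimal sketch of that lookup in C# (purely illustrative, nothing to do with the upgrade code itself):

```csharp
using System;

class ShowKnownFolders
{
    static void Main()
    {
        // Resolve the current locations of some "known folders".
        // If Documents has been redirected via the Location tab, this
        // returns the redirected path, not the default under C:\Users.
        Console.WriteLine(Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments));
        Console.WriteLine(Environment.GetFolderPath(Environment.SpecialFolder.MyPictures));
        Console.WriteLine(Environment.GetFolderPath(Environment.SpecialFolder.MyMusic));
    }
}
```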

Look through miglog.xml though (here is how to find it) and you will find lots of efforts to make sense of the user’s special folder layout. This is not my detailed diagnosis of the issue, just an observation having ploughed through long threads on Reddit and elsewhere; of course these threads are full of noise.

Here is an example of a user who suffered the problem and had an unusual setup: the location of his special folders had been moved (before the upgrade) to an external drive, but there was still important data in the old locations.

We await the official report with interest. But what can we conclude, other than to take backups (which we knew already)?

Two things. One is that Microsoft needs to do a better job of prioritising feedback from its Insider hub. Losing data is a critical issue. The feedback hub, like the forums, is full of noise; but it is possible to identify critical issues there.

This is related of course to the suspicion that Microsoft is now too reliant on unpaid enthusiast testers, at the expense of thorough internal testing. Both are needed and both, I am sure, exist. What, though, is the balance, and has internal testing been cut back on the strength of these widespread public betas?

The second thing is about priorities. There is a constant frustration that vendors (and Microsoft is not alone) pay too much attention to cosmetics and new features, and not enough to quality and fixing long-standing bugs and annoyances.

What do most users do after Windows upgrades? They are grateful that Windows is up and running again, and go back to working in Word and Excel. They do not care about cosmetic changes or new features they are unlikely to use. They do care about reliability. Such users are not wrong. They deserve better than to find documents missing.

One final note. Microsoft released Windows 10 1809 on 2nd October. However the initial rollout was said to be restricted to users who manually checked Windows Update or used the Update Assistant. Microsoft said that automatic rollout would not begin until Oct 9th. In my case though, on one PC, I got the update automatically (no manual check, no Insider Build setting) on October 3rd. I have seen similar reports from others. I got the update on an HP PC less than a year old, and my guess is that this is the reason:

With the October 2018 Update, we are expanding our use of machine learning and intelligently selecting devices that our data and feedback predict will have a smooth update experience.

In other words, my PC was automatically selected to give Microsoft data on upgrades expected to go smoothly. I am guessing though. I am sure I did not trigger the update myself, since I was away all day on the 2nd October, and buried in work on the 3rd when the update arrived (I switched to a laptop while it updated). I did not lose data, even though I do have a redirected Documents folder. I did see one anomaly: my desktop background was changed from blue to black, and I had to change it back manually.

What should you do if you have this problem and do not have backups? Microsoft asks you to call support. As far as I can tell, the files really are deleted so there will not be an easy route to recovery. The best chance is to use the PC as little as possible; do a low-level copy of the hard drive if you can. Shadow Copy Explorer may help. Another nice tool is Zero Assumption Recovery. What you recover is dependent on whether files have been overwritten by other files or not.

Update: Microsoft has posted an explanation of why the data loss occurred. It’s complicated and all to do with folder redirection (with a dash of OneDrive sync). It affected some users who redirected “known folders” like Documents to another location. The April 2018 update created spurious empty folders for some of these users. The October 2018 update therefore sought to delete them, but in doing so also deleted non-empty folders. It still looks like a bad bug to me: these were legitimate folders for storing user data and should not have been removed if not empty.

More encouraging is that Microsoft has made some changes to its feedback hub so that users can “provide an indication of impact and severity” when reporting issues. The hope is that Microsoft will find reports of severe bugs more easily and therefore take action.

Updated 8th Oct to remove references to OneDrive Sync and add support notes. Updated 10th Oct with reference to Microsoft’s explanatory post.

Review: Synology DS119j. Great system but single bay and underpowered hardware make it worth spending a bit more

Synology has released a new budget NAS, the DS119j, describing it as “An ideal first NAS for the home”.

It looks similar to the DS115j which it probably replaces – currently both models are listed on Synology’s site. What is the difference? The operating system is now 64-bit, the CPU now a dual-core ARMv8, though still at 800 MHz, and the read/write performance slightly bumped from 100 MB/s to 108 MB/s, according to the documentation.

I doubt any of these details will matter to the intended users, except that the more powerful CPU will help performance – though it is still underpowered, if you want to take advantage of the many applications which this device supports.

image

What you get is the DiskStation, which is a fairly slim white box with connections for power, a Gigabit Ethernet port, and two USB 2.0 ports. It is disappointing to see the slow USB 2.0 standard used here. You will also find a power supply, an Ethernet cable, and a small bag of screws.

image

The USB ports are for attaching USB storage devices or printers. These can then be accessed over the network.

The DS119j costs around £100.

Initial setup

You can buy these units either empty, as mine was, or pre-populated with a hard drive. Presuming it is empty, you slide the cover off, fit the 3.5" hard drive, secure it with four screws, then replace the cover and secure that with two screws.

What disk should you buy? A NAS is intended to be always on and you should get a 3.5" disk that is designed for this. Two common choices are the WD (Western Digital) Red series, and Seagate IronWolf series. At the time of writing, both a 4TB WD Red and a 4TB IronWolf are about £100 from Amazon UK. The IronWolf Pro is faster and specified for a longer life (no promises though), at around £150.

What about SSD? This is the future of storage (though the man from Seagate at Synology’s press event says hard drives will continue for a decade or more). SSD is much faster, but on a home NAS that advantage is limited by the network connection, and it is much more expensive for the same amount of storage. You will need a SATA SSD and a 3.5" adapter. Probably not the right choice for this NAS.

Fitting the drive is not difficult, but neither is it as easy as it could be. It is not difficult to make bays in which drives can be securely fitted without screws. Further, the design of the bay is such that you have to angle a screwdriver slightly to turn the screws. Finally, the screw holes in the case are made entirely of plastic and it would be easy to overtighten them and strip the thread, so be careful.

Once assembled, you connect the drive to a wired network and power it on. In most home settings, you will attach the drive to a network port on your broadband router. In other cases you may have a separate network switch. You cannot connect it over wifi and this would anyway be a mistake as you need the higher performance and reliability of a cable connection.

Once the NAS is connected to your network (and therefore to the internet) and powered on, you need to find it on the network, which you can do in one of several ways including:

– Download the DS Finder app for Android or iOS.

– Download Synology Assistant for Windows, Mac or Linux

– Have a look at your DHCP manager (probably in your router management for home users) and find the IP address

If you use DS Finder you can set up the Synology DiskStation from your phone. Otherwise, you can use a web browser (my preferred option). All you need to do to get started is to choose a username and password. You can also choose whether to link your DiskStation with a Synology account and create a QuickConnect ID for it. If you do this, you will be able to connect to your DiskStation over the Internet.

The DiskStation sets itself up in a default configuration. You will have network folders for music, photo, video, and another called home for other documents. Under home you will also find Drive, which behaves like a folder but has extra features for synchronization and file sharing. For full use of Drive, you need to install a Drive client from Synology.

image

If you attach a USB storage device to a port on the DS119j, it shows up automatically as usbshare1 on the network. This means that any USB drive becomes network storage, a handy feature, though only at USB 2.0 speed.

Synology DSM (Disk Station Manager)

Synology DSM is a version of Linux adapted by Synology. It is mature and robust, now at version 6.2. The reason a Synology NAS costs much more than say a 4TB WD Elements portable USB drive is that the Synology is actually a small server, focused on storage but capable of running many different types of application. DSM is the operating system. Like most Linux systems, you install applications via a package manager, and Synology maintains a long list of packages encompassing a diverse range of functions from backup and media serving through to business-oriented applications like running Java applications, a web server, Docker containers, support ticket management, email, and many more.

DSM also features a beautiful windowed user interface all running in the browser.

image

The installation and upgrade of packages is smooth, and whether you consider it as a NAS or as a complete server system for small businesses, it is impressive and (compared to a traditional Windows or Linux server) easy to use.

The question in relation to the DS119j is whether DSM is overkill for such a small, low-power device.

Hyper Backup

Given that this NAS only has a single drive, it is particularly important to back up any data. Synology includes an application for this purpose, called Hyper Backup.

image

Hyper Backup is very flexible and lets you back up to many destinations, including Amazon S3, Microsoft Azure, Synology’s own C2 cloud service, or local storage. For example, you could attach a large USB drive to the USB port and back up to that. Scheduling is built in.

I had a quick look at the Synology C2 service. It did not go well. I use the default web browser on Windows 10, Edge, and using Hyper Backup to Synology C2 just got me this error message.

image

I told Edge to pretend to be Firefox, which worked fine. I was invited to start a free trial. Then you get to choose a plan:

image

Plans start at €9.99 + VAT for 100GB for a year. Of course if you fill your 4TB drive that will not be enough. On the other hand, not everything needs to be backed up. Things like downloads that you can download again, or videos ripped from disks, are not so critical, or better backed up to local drives. Cloud backup is ideal though for important documents since it is an off-site backup. I have not compared prices, but I suspect that something like Amazon S3 or Microsoft Azure would be better value than Synology C2, though integration will be smooth with Synology’s service. Synology has its own datacentre in Frankfurt so it is not just reselling Amazon S3; this may also help with compliance.

An ideal first NAS?

The DS119j is not an ideal NAS for one simple reason: it has only a single bay so does not provide resilient storage. In other words, you should not have data that is stored only on this DiskStation, unless it is not important to you. You should ensure that it is backed up, maybe to another NAS or external drive, or maybe to cloud storage.

Still, if you are aware of the risks with a single drive NAS and take sensible precautions, you can live with it.

I like Synology DSM, which makes these small NAS devices great value as small servers. For home users, they are great for shared folders, media serving (I use Logitech Media Server with great success), and PC backup. For small businesses, they are a strong substitute for the role once occupied by Microsoft’s Small Business Server, as well as being cheaper and easier to use.

If you only want a networked file share, there are cheaper options from the likes of Buffalo, but Synology DSM is nicer to use.

If you want to make fuller use of DSM though, this model is not the best choice. I noticed the CPU often spiked just using the control panel and package manager.

image

I would suggest stretching to at least the DS218j, which is similar but has two bays, 512MB of RAM and a faster CPU. Better still, I like the x86-based Plus series – but a 2-bay DS218+ is over £300. A DS218j is half that price and perhaps the sweet spot for home users.

Finally, Synology could do better with documentation for the first-time user. Getting started is not too bad, but the fact is that DSM presents you with a myriad of options and applications and a better orientation guide would be helpful.

Conclusion? OK, but get the DS218j if you can.