Meizu M3 Max: Android 6.0 phablet, good value if you don’t mind Flyme OS

Meizu, one of the top ten smartphone manufacturers in China, has just brought out the M3 Max, an Android 6.0 phablet currently on offer for $224.99 (around £185), which seems great value for a 6.0″ smartphone complete with dual SIM slots and a fingerprint reader. I have been using it for a while to see how it stacks up against the competition.


My M3 Max is a sample, and while I believe it matches the production model in terms of hardware, you may find a few more concessions to non-Chinese users in the version for European and US markets. That said, my sample does include the Google Play Store and a thing called GMS Installer which assists installation of the Google Mobile Services required for Google-flavoured Android, which is what most users in countries like the UK and USA require.

This was my first experience of Meizu’s Flyme OS, a custom version of Android, and the distinctive one-button control. The front button on the M3 Max has multiple functions. Tap lightly and it is a back button. Press and click and it is a home button. Rest your finger and it is a fingerprint reader. And if you are wondering how to switch applications, that is a swipe up from the bottom of the screen.

I like having a hardware button, but I am not convinced that one button improves on the traditional Android three buttons: back, home, and app switcher. I also prefer the fingerprint reader on the back, as on recent Huawei phones. That said, I soon got used to it. You can register more than one fingerprint, and I found it useful to register my right thumb so that I can pick up the phone and tap my thumb on the front to unlock it.

Setting the phone up was a little more challenging than with Android devices designed primarily for our market. Meizu/Flyme has alternative apps for common requirements such as web browser, maps, music and even app store. I found myself downloading a bunch of apps to get a more familiar experience, including the Google Chrome browser, OneDrive, Outlook, Twitter, Facebook and Spotify. I did have a few issues with the Play Store initially – it would open and immediately crash – but things seemed to settle down after I applied a few updates.

There are a few compromises in a phone at this price point. The fingerprint reader is not the equal of the one on the Huawei P9 or Honor 8, for example, taking longer to register my fingerprint and requiring slightly more careful positioning to read it, but it still works satisfactorily. In day to day use I have no complaints about the responsiveness of the OS or the battery life.

Physically the M3 Max has a metal body and a smooth finish. The design is straightforward but pleasant enough. The case is 7.9mm thick, which makes it a relatively thin device if that is important to you. It is somewhat heavy though, about 190g, though in return you get a reassuringly solid feel.

The Flyme skin supports floating windows after a fashion.


Even on a 6″ device though, it is not all that useful since you can only really make use of one app at a time.

Swipe down from the top to reveal notifications and the usual array of Android shortcuts.


The camera is nothing spectacular but does cover most of the features you are likely to want. Tap the Auto button to reveal popular features like Panorama and Macro. This is also the route to video recording.


If you choose Manual on this screen, you can make your own settings for:

  • Exposure time
  • ISO
  • Focus
  • Exposure compensation
  • Saturation
  • Contrast
  • White balance

A decent range of controls.

The Settings button lets you specify photo size as well as other features like grid lines.


Benchmarks and specifications

I ran some benchmarks. PC Mark came up with a score of 3156 for its Work 2.0 performance.


Geekbench 4.0.1 delivered:

  • 1475 RenderScript Score
  • 683 Single-Core Score
  • 2670 Multi-Core Score

While these results are unexciting, at this price point they are more than reasonable.


  • Android 6
  • ARM MT6755M 1 GHz 8 core CPU
  • 6” display, 1080×1920, 367 ppi
  • Capacitive touch screen
  • GPS
  • 3GB RAM
  • 64GB storage
  • Second SIM slot can also be used for up to 128GB SD card
  • Mali-T860 GPU
  • 13MP rear camera
  • 5MP front camera
  • 4100 mAh battery
  • Weight 190g
  • Size 163.4 x 81.6 x 7.9mm


Meizu is not a well-known brand in the UK or USA, but they are a major Chinese vendor, though pitching towards the lower end of the market. This is a good value device and a solid choice if you are looking for a phablet-style phone in this price range and can put up with a slightly less familiar Android experience.

You can purchase from here.

Review: Libratone Zipp Mini

I am quite taken with this Libratone wireless speaker, though I had a few setup hassles. The device comes in a distinctive cylindrical box with a nightingale image on the top. Unpack it and you get a medium-size desktop (or table or shelf) speaker, around 22cm high, with a colourful cover that looks zipped on and a carry strap. There is also a power supply with UK and European adaptors, and a very brief instruction leaflet.


Plug in, and the device starts charging. The leaflet says to download the app (for iOS or Android) and “set up and play”. It was not quite so easy for me, using Android. The app is over-designed, by which I mean it looks great but does not always work intuitively. It did not find the speaker automatically, insisted that a wi-fi connection was better than Bluetooth, but gave me no help connecting.

After tinkering for a bit I went to the website and followed the steps for manual wi-fi setup. Essentially you temporarily disconnect from your normal Wi-fi connection, connect your wi-fi directly to the Zipp, go to in the browser, select your home wi-fi network, enter the password, and you are done.

Everything worked perfectly after that. I fired up Spotify, played some music, selected the Zipp under Spotify Connect, and it sounded great. For some Android apps you may need a Bluetooth connection though, or you can use DLNA. The beauty of Spotify Connect is that the connection is direct from the speaker to the internet; it does not depend on the app running, so you can switch off your phone and it still plays. It is actually a better solution than Apple Airplay for internet streaming.

The Nightingale button

Control is either via the app, or through the Nightingale button on the top of the speaker. The button works really well. Tap to pause or resume. Slide finger clockwise or anti-clockwise for volume. Skip forward or back by tapping the right or left edge. Then there is a neat “hush” feature: place your hand over the button and it mutes temporarily.

A bit more about the sound. Although this is the smaller Zipp Mini, you can tell that Libratone has taken trouble to make it sound good, and it is impressively rich and full considering the size of the unit. You are getting your money’s worth, despite what seems a high price.

I spent some time comparing the Zipp with the Squeezebox Radio, another (but sadly discontinued) wireless audio device I rate highly. Both are mono, both sound good. I did notice that the Zipp has deeper bass and a slightly softer, more recessed treble. I cannot decide for sure which sounds better, but I am slightly inclined towards the Libratone, which is actually high praise.

One lovely feature of the Zipp is internet radio, which comes via Vtuner. This is hidden in the feature called Favourites. You select favourite radio stations in the app, with the default being BBC stations and Classic FM. You can change your favourites by tapping the Nightingale icon in the app (another hidden, over-designed feature) and tapping My Radio.


Once set up, tap the heart button on the Nightingale button on the device to switch to radio. Tap twice to skip to the next station. Internet radio does not depend on having the app running; it works directly from the Zipp.

The Zipp has a power button: press and hold to power on or off, tap to show remaining battery. It also has an aux jack socket for wired playback from any source, and a USB socket which you can use either for charging a phone or for playback from music files on USB storage (I did not try this, but a wide range of formats is supported, including MP3, WAV, FLAC, Ogg Vorbis, WMA, AAC, AIFF and ALAC). You can also use USB for wired playback from iOS, but not from other devices.

Apple Airplay is supported and worked great when I tried it with an iPad. One thing to note: there is currently no iPad app, so you have to search for the iPhone app, which does also work on the iPad.

This very flexible device also supports Bluetooth 4.1 and you can use it as a speaker phone, just tap the Nightingale button to answer a call, so yes it has a microphone too. It also supports DLNA which means you can “play to” the device on some applications, such as Windows Media Player.

If you have more than one Zipp you can connect them for multi-speaker playback. You can select Stereo if you have two speakers or more, but Libratone recommend something they call FullRoom, which means leave it to their digital signal processing (DSP).

Sadly I only have one Zipp, but there are a few options in the app to set DSP optimization for things like Outdoor, Shelf and Floor. I did not notice a huge difference.

You can get different colour covers, and I tried removing mine. It is a bit fiddly, and the current Zipp Mini does not quite match the explanation on the Libratone site. The handle on this Zipp does not come off; you unzip the cover, twist to disconnect the zip, then feed the handle through the hole. Not something you are likely to do often.

The device naked

Finally, if you are curious like me, here are some specifications:

  • Class D amplifier
  • 1 x 3” woofer, 1 x 1” tweeter, 2 x 3.5” low-frequency radiators
  • Frequency response 60-20,000 Hz (no dB range specified)
  • Maximum volume 96 dB SPL/1m
  • 2400 mAh battery
  • Bluetooth 4.1
  • 10 hours of playback approx.

Conclusion? I really like the Zipp Mini. It sounds great, supports a wide range of standards, and works well for Internet radio. I like the appearance, the Nightingale button is elegant, and you can expand it with more speakers if needed. This or the larger Zipp model might be all the hi-fi you need.

Caveats: many of the features are a bit hidden, initial setup I found fiddly, the supplied instructions are hopelessly inadequate, and with all those choices it can get confusing.

No matter, it is a lovely device.

More information on the vendor’s site here.

Microsoft improves Windows Subsystem for Linux: launch Windows apps from Linux and vice versa

The Windows 10 anniversary update introduced a major new feature: a subsystem for Linux. Microsoft marketing execs call this Bash on Windows; Ubuntu calls it Ubuntu on Windows; but Windows Subsystem for Linux is the most accurate description. You run a Linux binary and the subsystem redirects system calls so that it behaves like Linux.


The first implementation (which is designated Beta) has an obvious limitation. Linux works great, and Windows works great, but they do not interoperate, other than via the file system or networking. This means you cannot do what I used to do in the days of Services for Unix: type ls at the command prompt to get a directory listing.

That is now changing. Version 14951 introduces interop so that you can launch Windows executables from the Bash shell and vice versa. This is particularly helpful since the subsystem does not support GUI applications. One of the obvious use cases is launching a GUI editor from Bash, such as Visual Studio Code or Notepad++.

The nitty-gritty on how it works is here.


Limitations? A few. Environment variables are not shared so an executable that is on the Windows PATH may not be on the Linux PATH. The executable also needs to be on a filesystem compatible with DrvFs, which means NTFS or ReFS, not FAT32 or exFAT.

This is good stuff though. If you work on Windows, but love Linux utilities like grep, now you can use them seamlessly from Windows. And if you are developing Linux applications with say PHP or Node.js, now you can develop in the Linux environment but use Windows editors.
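To make the interop concrete, here is a minimal sketch of both directions. The paths and file names are illustrative; the WSL-specific lines assume build 14951 or later with Windows executables on an NTFS volume mounted via DrvFs, so they are shown as comments, while the final grep example runs on any system with Bash:

```shell
# From the Bash shell, a Windows executable runs if you give its name
# including the .exe extension (Windows PATH entries are not inherited):
#   /mnt/c/Windows/System32/notepad.exe notes.txt
#
# From cmd.exe or PowerShell, a Linux tool runs via bash -c:
#   bash -c "ls -la /mnt/c/Users"
#
# The same pattern lets you point Linux utilities at Windows files,
# for example filtering a log with grep (this part runs anywhere):
printf 'info: started\nerror: disk full\n' > /tmp/app.log
bash -c "grep -i 'error' /tmp/app.log"   # prints: error: disk full
```

Note that the Windows executable must be invoked with its .exe extension, since as far as the Linux side is concerned it is just another file on the mounted volume.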

Note that this is all still in preview and I am not aware of an announced date for the first non-beta release.

Microsoft sets Visual Studio LightSwitch to off

Microsoft has officially announced the end of development of LightSwitch, a rapid application builder for desktop and mobile applications.

LightSwitch was introduced in July 2011 as a tool to build multi-tier applications using a data-first approach. You can design your database using an excellent visual designer, design screens for viewing and editing the data using a non-visual designer, and generate applications with the server-side code hosted either on your own server or on Microsoft Azure. The client application in the original LightSwitch was based on Silverlight, but this was later extended with an option for HTML. You can get a feel for the general approach from my early hands-on here.

As I noted at the time, LightSwitch abstracts a number of difficult tasks. This is a good thing, though as with any application generator you have to take time to learn its quirks. That said, it is more usable than most model-driven development tools, in my experience.

LightSwitch had some bad luck. It was conceived at a time when Silverlight looked like the future of Microsoft’s client development platform, but by the time it launched Silverlight was heading for obsolescence. It also fell victim to ideologies within Microsoft (which persist today) that chase the dream of code-free application development that anyone can do. The documentation for LightSwitch on launch was dreadful, a series of how-tos that neglected to explain how the tool worked. You had to get the software development kit, aimed at those building LightSwitch components, to have any hope of understanding the tool.


The abandonment of LightSwitch is not a surprise. Microsoft had stopped talking about it and adoption was poor. There will be no tooling for it in the next Visual Studio, though you can keep using it for a while if you want.

I think it is a shame since it is a promising tool and I cannot help thinking that with more intelligent positioning and a few tweaks to the product and its documentation it could have been a success. Those who did get to grips with it found it very good.

What is unfortunate is that Microsoft has lost the faith of many developers thanks to the many shifts in its development strategy. I know component vendors have also been caught out by the Silverlight and then LightSwitch debacle. Here is one of the comments on the announcement:

Microsoft keeps doing this over and over, we invest months even years to master a technology, just to find out it’s being phased out prematurely. Perfectly good, one-of-a-kind niche tools too. So much investment on both sides (both MS and customers) down the drain. What’s worse, it is done is a non-transparent, dishonest manner, letting things dry up over a couple years so that when the announcement comes, no-one really cares any more, no more noise – just look at this blog.

This makes it hard for the company to convince developers that its new strategies du jour have a longer life ahead of them. I am thinking of the UWP (Universal Windows Platform), which has already changed substantially since its first conception, and of PowerApps, the supposed replacement for LightSwitch, and yet another attempt to promote code-free development.

Developers do not want code-free development. They like tools that do stuff for them, if they are intuitive and transparent, but they also like an easy route to adding and modifying code in order to have the application work the way they want.

Bob Dylan’s Nobel Prize for Literature

Yesterday, Bob Dylan was awarded the Nobel Prize for Literature.

Swedish Academy Permanent Secretary Sara Danius announces that Bob Dylan has been awarded the Nobel Prize for Literature

The reason stated was “for having created new poetic expressions within the great American song tradition”.

Of course I am delighted, as a long-time Dylan fan. I am not quite of his generation; I was too young to take in what was happening in the sixties, but I was there for the release of Blood on the Tracks in 1975, among his best releases. In fact, not having much money at the time, I won my copy from the local paper in a competition.


I cannot therefore be objective about this award (and you can debate whether objectivity in literary criticism exists), but I do have some reflections.

First, this feels like some sort of establishment recognition that superlative literature has emerged from the popular music of my generation. In one sense Dylan stands as representative among others who could plausibly have been given this award, most obviously Leonard Cohen – who was a poet before he was a songwriter – but also Joni Mitchell and perhaps others, Paul Simon, David Bowie, Paul McCartney, Bruce Springsteen, Van Morrison, Ray Davies, Tom Waits, Jackson Browne, Elvis Costello, and I am sure you can add to this list.

That said, Dylan is both like and unlike others in this kind of list. Again, I cannot be objective, but it has always seemed to me that Dylan’s ability to reinvent language, to innovate with words, is something that sets him apart. There are many examples; one that comes to mind is in Jokerman, where Dylan sings of:

False-hearted judges dying in the webs that they spin
only a matter of time ‘til night comes steppin’ in

On a quick read this looks clichéd: judges spinning words, night stepping in. Then you look more closely. The judges are spinning webs, not words, and they are then caught like flies in the webs of their own making. It is a powerful image, and packs two meanings of “spin”, words and webs, into one thought.

And what about night stepping in, what is that about? It sounds like a cliché, but has anyone else used that image? And you can muse about the meaning of “stepping in”; does it mean imposing authority, like a teacher or policeman or God “stepping in”; or is it stepping in like a dancer, a beautiful, natural restoration of order?

There is no answer to these questions, but you can say that Dylan’s work, at its best, rewards study and reflection in a way that few can match.

Christopher Ricks, a Dylan critic with impeccable credentials (Professor of English at Bristol, Cambridge and now Boston University), is happy to make the comparison with Shakespeare:

"I think Shakespeare sought the widest possible constituency. One reason I keep mentioning Shakespeare is not because I think Dylan is a genius, which I do, but because I think that like Shakespeare he sought the widest possible constituency."

I believe the award to be merited then. I also acknowledge though that Dylan does not fit the norms of great writers. He is instinctive; he says he writes quickly, and it seems that he does not curate his own output with the care that characterises most poets.

There is also the awkward question of what the words mean, in many of his songs, and whether that matters. Trying to puzzle out meanings is part of the fun, but also gets you lost in a maze, wondering if Dylan is having some kind of joke at your expense.

In Tangled up in Blue, a woman:

opened up a book of poems and handed it to me
written by an Italian poet from the thirteenth century
and every one of those words rang true
and glowed like burnin’ coal
pourin’ off of every page
like it was written in my soul from me to you

Which poet? Dante? Petrarch (actually 14th century)? Dylan once told Craig McGregor it was Plutarch, perhaps meaning Petrarch. Then again he sometimes seems to sing “from the fifteenth century”. And often he changes the words completely, mentioning Charles Baudelaire, or the Bible, with various references from Jeremiah. And in London on 25 Oct 2015 he sang:

she opened up a book of poems and she said them to me slow, you know
memorise these lines and remember these rhymes
when you’re out there walking to and fro
Every one of those words rang true and glowed like burning coals
pouring off of every page like it was written in my soul from me to you

We can conclude I think that Dylan did not care much about which book it was, or which poet. He cares more about the lines beginning “Every one of those words rang true”, which he uses in most variants, though many performances omit the verse entirely.

I mention this to show that Dylan is a slippery subject, and as a warning to anyone who purchases a book of his lyrics that they are not reading the whole story. In some cases, the printed lyrics are simply wrong, as in these words from Subterranean Homesick Blues:

Walk on your tiptoes
Don’t try “No-Doz”

The lyric is actually “Don’t tie no bows” and we have evidence:


It is true that one of the cards in the famous video for the song says No Dose:


but those cards are a humorous counterpoint to the words, not a transcription of the lyrics, and probably written by Allen Ginsberg (the man on the left in the video) rather than Dylan.

Incidentally, the lack of any reference to “don’t tie no bows” is why I cancelled my order for the very expensive edition of the Lyrics since 1962, supposedly giving variations, published in 2014 and edited by Ricks and Lisa Nemrow. Someone needs to do a much better job, encompassing live performances as well as released albums.

You never know then when Dylan is playing with you, or just being careless, and sometimes he just throws words together in evocative ways and the search for explicit meaning is unrewarding.

I encourage anybody who has not done so to explore the work of Bob Dylan, though recognising that it is not for everyone, and some cannot get through the “voice like sand and glue”, as David Bowie put it.

The Nobel prize is deserved, but it is a curious body of work, and it will be a long time before we can get anything like a true perspective on it.

For me that does not matter; I am happy to enjoy it.

Notes from the field: Office 365 Cutover Migration for a small business and the mysteries of mail-enabled users

I assisted a small company in migrating from Small Business Server 2011 to Office 365.

SBS 2011 was the last full edition of Small Business Server, with Exchange included. It still works fine but is getting out of date, and Microsoft has no replacement other than full Exchange and multiple servers at far greater cost, or Office 365.

There must be hundreds of thousands of businesses who have done this or will do it, and you would expect Microsoft’s procedures to be pretty smooth by now. I have done this before, but not for a couple of years, so was interested to see how it now looks.

The goal here is to migrate email (I am not going to cover SharePoint or other aspects of migration here) in such a way that no email or other Outlook data is lost, and that users have a smooth transition from using an internal mail server to using Office 365.

What you do first is to set up the Office 365 tenant and add the email domain. You do not complete the DNS changes immediately, in particular the MX record that determines where incoming mail is sent.

Now you have a few choices. In the new Office 365 Admin center, in the Users section, there is a section called Data Migration, which has an option for Exchange. “We will … guide you through the rest of the migration experience,” it says.

If you select Exchange you are offered the Office 365 Hybrid Configuration Wizard. You do not want to use this for Small Business Server. It sets up a hybrid configuration with Exchange Federation Trust, for a setup where Office 365 and on-premises Exchange co-exist. Click on this image if you want to know more. I have no idea if it would work but it is unnecessarily complicated.


No, what you should do is go down the page and click “Exchange Online migration and deployment guidance for your organisation”. Now we have a few options, the main relevant ones being Cutover and Hybrid 2010. Except you cannot use Hybrid 2010 if you have a single-server setup, because this requires directory synchronization. And you cannot install DirSync, nor its successor Azure AD Connect, on a server that is a Domain Controller.

So in most SBS cases you are going to do a Cutover migration, suitable for “fewer than 2000 mailboxes” according to Microsoft. The SBS maximum is 75 so you should be fine.

Click Cutover Migration and you get to a nice migration assistant with 15 steps. Let’s get started.


So I did, and while it mostly works there are some gotchas, and I am not impressed with the documentation. It combines patronising “this is going to be easy” instructions with links that dump you into other documents that are more general, or do not cover your exact situation, particularly in the case of the mysterious “Create mail-enabled users”, of which more below.

Steps 1-5 went fine and then I was on step 6, Migrate your mailboxes. This guides you to the Migration Batch tool. This tool connects to your SBS Exchange, creates Office 365 users for each Exchange mailbox if they do not already exist, and then copies all the contents of those mailboxes to the new mailboxes in Office 365.


While this tool is useful, I found I had what seemed to me obvious questions that the documentation, such as it is, does not address. One is, what do you do if one or more mailboxes fail to sync, or sync with errors reported, which is common. The document just advises you to look at the log files. What if you stop and then resume a migration batch, what actually happens? What if you delete and recreate a migration batch (as support sometimes advises), do you get duplicate items? Do you need to delete the existing users? How do you get to the Finalized state for a mailbox? It would be most helpful if Microsoft would provide detailed documentation for this too, but if it does, I have not found it.

The migration can take a long time, depending of course on the size of your mailboxes and the speed of your connection. I was lucky: with just 11 users it took less than a day. I have known this tool to run for several days; it could take weeks over an ADSL connection.

Note that even when all mailboxes are synced, mail is still flowing to on-premises Exchange, so the sync is immediately out of date. You are not done yet.

The mysteries of converting to Mail-Enabled Users

I got to Synced after only a few hiccups. Now comes the strange bit. Step 7 is called Create mail-enabled users.



There are numerous problems with this step. It does not fully explain the implications of what it describes. It does not actually work without tweaking. The documentation is sloppy.

Do you need to do this step at all? No, but it does have some advantages. What it does is to remove (actually disconnect rather than delete) the on-premises mailbox from each user, and set the TargetAddress attribute in Active Directory, which tells Exchange to route mail to the TargetAddress rather than trying to deliver it locally. The TargetAddress, which is only viewable through ADSI Edit or command-line tools, should be set to the unique Office 365 email address for each user, typically, rather than the main email address. If I have this right (and it is not clearly explained), this means that any email that happens to arrive at on-premises Exchange, either because of old MX records or because the on-premises Exchange is hard-coded as the target server, gets sent on to Office 365.

Update: there is one scenario where you absolutely DO need this step. This is if you want to use Azure AD Connect to sync on-premises AD with Office 365 after doing the mail migration. See this thread and the comment:

“To covert on-premises mailboxes to mail-enabled users is required. When you convert on-premises mailboxes to mail-enabled users (MEUs), the proxy addresses and other information from the Office 365 mailboxes are copied to the MEUs, which reside in Active Directory in your on-premises organization. These MEU properties enable the Directory Synchronization tool, which you activate and install in step 3, to match each MEU with its corresponding cloud mailbox.”

The documentation for this step explains how to create a CSV file with the primary email addresses of the users to convert (this works), and then refers you to this document for the PowerShell scripts to complete the step. You will note that this document refers to Exchange 2007, though the steps also apply to Exchange 2010, and to a Staged Exchange migration, when you are doing a Cutover. Further, the scripts are embedded in the text, so you have to copy and paste. Worse, the scripts do not work if you try to follow the instructions exactly. There are several issues.

First, this step seems to be in the wrong place. You should change the MX records to route mail to Office 365, and then leave an interval of at least a few hours, before doing this step. The reason is that once you convert SBS users to mail-enabled users, the Migration tool will not be able to re-sync their mailbox. You must complete a sync immediately before doing the conversion. The only way I know to force a sync is to stop and then resume the Migration Batch. Check that all mailboxes are synced, which only takes a few minutes, before doing the conversion. You may still lose an email if it arrives in the window between the last sync and the conversion, which is why you should change the MX records first.
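Since getting the MX cutover right matters here, it is worth confirming the record before converting any users. A small sketch of the check (the domain and target host below are illustrative; your tenant's real target is shown in the Office 365 admin portal, and you would obtain the live record with `dig +short MX yourdomain.com`):

```shell
# Sample MX lookup result, as dig +short MX yourdomain.com might return it;
# the mail.protection.outlook.com suffix is the usual Exchange Online
# pattern, but your tenant's host will differ.
mx="0 contoso-com.mail.protection.outlook.com."

# Only proceed with the mail-enabled user conversion once the record
# points at Exchange Online.
case "$mx" in
  *protection.outlook.com*) echo "MX cutover complete - safe to convert users" ;;
  *)                        echo "MX still points at the old server - wait" ;;
esac
```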

Second, if you run ExportO365UserInfo.ps1 in the Small Business Server Exchange Shell, it will not work, since “By default, Import-PSSession does not import commands that have the same name as commands in the current session.” This means that when the script runs mailbox commands they run against the local Exchange server rather than Office 365, unless you use the -AllowClobber parameter. I found the solution was to run this script on another machine.

Third, the script still does not work, since, in my case at least, the Migration Batch did not populate the email address for imported users. I fixed this with a handy script.

Note that the second script, Exchange2007MBtoMEU.ps1, must be run in the SBS server Exchange Shell, otherwise it will not work.

Bearing in mind all these hazards, you might think that the whole, not strictly necessary, step of converting to mail-enabled users is not worth it. That is perfectly reasonable.

Finishing the job

Bearing in mind the above, the next steps do not altogether make sense. In particular, step 11, which says to make sure that:

“Office 365 mailboxes were synchronized at least once after mail began being sent directly to them. To do this, make sure that the value in the Last Synced Time box for the migration batch is more recent than when mail started being routed directly to Office 365 mailboxes.”

In fact, you will get errors here if you followed Step 7 to create mail-enabled users. Did anyone at Microsoft try to follow these steps?

Still, I have to say that the outcome in our case was excellent. Everything was copied correctly, and the Migration Batch tool even successfully replicated fiddly things like calendar permissions. The transition was smooth.

Note that you should not attempt to point an existing Outlook profile at the Office 365 Exchange. Instead, create a new profile. Otherwise I am not sure what happens; you probably get thousands of duplicate items.

One puzzle. I did not spot any duplicates in the synced mailboxes, but the item count increased by around 20% compared to the old mailboxes, as reported by PowerShell. Currently a mystery.

Closing words

I am puzzled that Microsoft does not have any guidance specifically for Small Business Server migrations, given how common these are, as well as by the poor and inaccurate documentation as noted above.

There are perhaps two factors at play. One is that Microsoft expects businesses of any size to use partners for this kind of work, who specialise in knowing the pitfalls. Second, the company seems so focused on enterprises that the needs of small businesses are neglected. Note, for example, the strong push for businesses to use the Azure AD Connect tool even though this requires a multi-server setup. There is a special tool in Windows Server Essentials, but this does not apply for businesses using a Standard edition of Small Business Server.

Finally, note that there are third-party tools you can use for this kind of migration, in particular BitTitan’s MigrationWiz, which may well be easier though a small cost is involved.