Category Archives: cloud computing

From Windows Embedded to cloud: Microsoft announces the Connected Vehicle Platform

Microsoft has announced the Connected Vehicle Platform at the CES event under way in Las Vegas.

The company is not new to in-car systems, but its track record is disappointing. It used to be all about Windows Embedded, using Windows CE to make a vehicle into a smart device.

Ford was Microsoft’s biggest partner. It built Ford SYNC on the platform and in 2012 announced five years of partnership and 5 million SYNC-enabled vehicles.

However, in 2014 Ford announced SYNC 3 with no mention of Microsoft – because SYNC 3 uses BlackBerry’s QNX.

What went wrong? There’s a 2014 analysis from Bill Howard that offers a few clues. The bit that chimes with me is that Microsoft was too slow in updating the system. The overall Windows story over the last 10 years is convoluted to say the least, with many changes to the platform and disruptive (in a bad way) strategy shifts. The same factor is a large part of why Windows Phone failed.

It is not clear at this stage whether or not Microsoft’s Connected Vehicle Platform partners (which include Renault-Nissan and BMW) will use Windows Embedded in their solutions; but what is notable is that Microsoft’s release makes no mention of it. The company has shifted to a cloud strategy, and is primarily offering Azure services rather than mandating how manufacturers choose to consume them. The detail of the announcement identifies five key areas:

  • Telematics and Predictive services
  • Marketing (“Customer insights and engagement”)
  • Productivity (Office 365, Skype)
  • Connected ADAS (Advanced Driver Assistance Systems), i.e. the car helping you to drive
  • Advanced Navigation

Cortana also gets a mention. We may think of Cortana as a virtual assistant, but in this context it means a user interface to intelligent services.

There is big competition for all this of course, with Google, Amazon and Apple also in this space. There is also politics involved. If you read Howard’s analysis linked above, note that he mentions how the auto companies dislike restrictions such as Google insisting that you can’t have Google Search unless you also use Google Maps (I have no idea if this is still the case). There is a tension here. In-car systems are an important value-add for customers and critical to marketing vehicles, but the auto companies do not want their vehicles to become just another channel for big data-gathering companies like Google and Amazon.

Another point of interest is how smartphones interact with your car. If you want a simple and integrated experience, you can just dock your phone and use it for navigation, communication and entertainment – three key areas for in-car systems. On the other hand, a docked phone will not have the built-in screen and control of vehicle features that an embedded system can offer.

Amazon Web Services opens London data centers

Amazon Web Services (AWS) has opened a London Region, fulfilling its promise to open data centers in the UK, and joining Microsoft Azure which opened UK data centers in September 2016.

This is the third AWS European region, joining Ireland and Germany, and a fourth region, in France, has also been announced.

A region is not a single data center; it comprises at least two “Availability Zones”, each of which is a separate data center.

Notes from the field: Office 365 Cutover Migration for a small business and the mysteries of mail-enabled users

I assisted a small company in migrating from Small Business Server 2011 to Office 365.

SBS 2011 was the last full edition of Small Business Server, with Exchange included. It still works fine but is getting out of date, and Microsoft has no replacement other than full Exchange and multiple servers at far greater cost, or Office 365.

There must be hundreds of thousands of businesses which have done this or will do it, and you would expect Microsoft’s procedures to be pretty smooth by now. I have done this before, but not for a couple of years, so I was interested to see how it now looks.

The goal here is to migrate email (I am not going to cover SharePoint or other aspects of migration here) in such a way that no email or other Outlook data is lost, and that users have a smooth transition from using an internal mail server to using Office 365.

What you do first is to set up the Office 365 tenant and add the email domain, for example yourbusiness.co.uk. You do not complete the DNS changes immediately, in particular the MX record that determines where incoming mail is sent.
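
A quick way to see where the MX record currently points, and later to confirm the change has propagated, is Resolve-DnsName from any recent Windows machine (the domain here is just the example used above):

# Check which server currently receives mail for the domain
Resolve-DnsName -Name yourbusiness.co.uk -Type MX | Select-Object NameExchange, Preference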

Now you have a few choices. In the new Office 365 Admin center, under Users, there is a section called Data Migration, which has an option for Exchange. “We will … guide you through the rest of the migration experience,” it says.

If you select Exchange you are offered the Office 365 Hybrid Configuration Wizard. You do not want to use this for Small Business Server. It sets up a hybrid configuration with an Exchange Federation Trust, for a setup where Office 365 and on-premises Exchange co-exist. I have no idea if it would work, but it is unnecessarily complicated.

No, what you should do is go down the page and click “Exchange Online migration and deployment guidance for your organisation”. Now we have a few options, the main relevant ones being Cutover and Hybrid 2010. Except you cannot use Hybrid 2010 if you have a single-server setup, because this requires directory synchronization. And you cannot install DirSync, nor its successor Azure AD Connect, on a server that is a Domain Controller.

So in most SBS cases you are going to do a Cutover migration, suitable for “fewer than 2000 mailboxes” according to Microsoft. The SBS maximum is 75 so you should be fine.

Click Cutover Migration and you get to a nice migration assistant with 15 steps. Let’s get started.

So I did, and while it mostly works there are some gotchas, and I am not impressed with the documentation. It combines patronising “this is going to be easy” instructions with links that dump you into other documents that are more general, or do not cover your exact situation, particularly in the case of the mysterious “Create mail-enabled users”, of which more below.

Steps 1-5 went fine and then I was on step 6, Migrate your mailboxes. This guides you to the Migration Batch tool. This tool connects to your SBS Exchange, creates Office 365 users for each Exchange mailbox if they do not already exist, and then copies all the contents of those mailboxes to the new mailboxes in Office 365.
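
Incidentally, the same batch can also be created from Exchange Online PowerShell rather than the portal. A minimal sketch for a cutover batch, assuming an Outlook Anywhere migration endpoint and using example names (the remote session setup is shown later in this post):

# Create a migration endpoint pointing at the on-premises (SBS) Exchange via Outlook Anywhere
$onPremCred = Get-Credential    # on-premises admin credentials
New-MigrationEndpoint -ExchangeOutlookAnywhere -Name "SBSEndpoint" -Autodiscover -EmailAddress admin@yourbusiness.co.uk -Credentials $onPremCred

# Create and start the cutover batch using that endpoint
New-MigrationBatch -Name "CutoverBatch" -SourceEndpoint "SBSEndpoint" -NotificationEmails admin@yourbusiness.co.uk -AutoStart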

While this tool is useful, I found I had what seemed to me obvious questions that the documentation, such as it is, does not address. One is what to do if one or more mailboxes fail to sync, or sync with errors reported, which is common; the document just advises you to look at the log files. What actually happens if you stop and then resume a migration batch? If you delete and recreate a migration batch (as support sometimes advises), do you get duplicate items? Do you need to delete the existing users? How do you get to the Finalized state for a mailbox? It would be most helpful if Microsoft provided detailed documentation for this too, but if it does, I have not found it.
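
The portal does not give much away, but PowerShell reveals a little more about what each mailbox in the batch is doing. A sketch, assuming an Exchange Online PowerShell session and an example mailbox name:

# Summarise the state of every mailbox in the batch
Get-MigrationUser | Group-Object Status

# Drill into one problem mailbox, including the per-item error report
Get-MigrationUserStatistics -Identity alice@yourbusiness.co.uk -IncludeReport | Format-List Status, Error, SyncedItemCount, SkippedItemCount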

The migration can take a long time, depending of course on the size of your mailboxes and the speed of your connection. I was lucky: with just 11 users it took less than a day. I have known this tool to run for several days; it could take weeks over an ADSL connection.

Note that even when all mailboxes are synced, mail is still flowing to on-premises Exchange, so the sync is immediately out of date. You are not done yet.

The mysteries of converting to Mail-Enabled Users

I got to the Synced state after only a few hiccups. Now comes the strange bit. Step 7 is called Create mail-enabled users.

There are numerous problems with this step. It does not fully explain the implications of what it describes. It does not actually work without tweaking. The documentation is sloppy.

Do you need to do this step at all? No, but it does have some advantages. What it does is to remove (actually disconnect rather than delete) the on-premises mailbox from each user, and set the TargetAddress attribute in Active Directory, which tells Exchange to route mail to the TargetAddress rather than trying to deliver it locally. The TargetAddress, which is only viewable through ADSI Edit or command-line tools, should be set to the unique Office 365 email address for each user, typically username@yourbusiness.onmicrosoft.com, rather than the main email address. If I have this right (and it is not clearly explained), this means that any email which happens to arrive at on-premises Exchange, either because of old MX records or because the on-premises Exchange is hard-coded as the target server, gets forwarded to Office 365.
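
If you want to inspect or set the attribute yourself rather than trust the scripts, the Active Directory PowerShell module is less painful than ADSI Edit. A sketch, with an example user and tenant name:

# View the routing address Exchange will forward to (empty until the user is converted)
Get-ADUser -Identity alice -Properties targetAddress | Select-Object Name, targetAddress

# Set it manually to the user's onmicrosoft.com address
Set-ADUser -Identity alice -Replace @{targetAddress = "SMTP:alice@yourbusiness.onmicrosoft.com"}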

Update: there is one scenario where you absolutely DO need this step. This is if you want to use Azure AD Connect to sync on-premises AD with Office 365 after doing the mail migration. See this thread and the comment:

“To covert on-premises mailboxes to mail-enabled users is required. When you convert on-premises mailboxes to mail-enabled users (MEUs), the proxy addresses and other information from the Office 365 mailboxes are copied to the MEUs, which reside in Active Directory in your on-premises organization. These MEU properties enable the Directory Synchronization tool, which you activate and install in step 3, to match each MEU with its corresponding cloud mailbox.”

The documentation for this step explains how to create a CSV file with the primary email addresses of the users to convert (this works), and then refers you to this document for the PowerShell scripts to complete the step. You will note that this document refers to Exchange 2007 (though the steps also apply to Exchange 2010) and to a Staged Exchange migration, even though you are doing a Cutover. Further, the scripts are embedded in the text, so you have to copy and paste. Worse, the scripts do not work if you try to follow the instructions exactly. There are several issues.

First, this step seems to be in the wrong place. You should change the MX records to route mail to Office 365, and then leave an interval of at least a few hours, before doing this step. The reason is that once you convert SBS users to mail-enabled users, the Migration tool will not be able to re-sync their mailbox. You must complete a sync immediately before doing the conversion. The only way I know to force a sync is to stop and then resume the Migration Batch. Check that all mailboxes are synced, which only takes a few minutes, before doing the conversion. You may still lose an email if it arrives in the window between the last sync and the conversion, which is why you should change the MX records first.
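
Both the stop/resume dance and the final check can be done from the Exchange Online session; a sketch, using the example batch name from earlier:

# Force a final incremental sync by stopping and resuming the batch
Stop-MigrationBatch -Identity "CutoverBatch"
Start-MigrationBatch -Identity "CutoverBatch"

# Confirm the batch and every mailbox in it reached the Synced state before converting any users
Get-MigrationBatch -Identity "CutoverBatch" | Format-List Status
Get-MigrationUser -BatchId "CutoverBatch" | Format-Table Identity, Status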

Second, if you run ExportO365UserInfo.ps1 in the Small Business Server Exchange Shell, it will not work, since “By default, Import-PSSession does not import commands that have the same name as commands in the current session.” This means that when the script runs mailbox commands they run against the local Exchange server rather than Office 365, unless you use the -AllowClobber parameter. I found the solution was to run this script on another machine.
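
For anyone hitting the same problem, this is roughly what the remote session setup looks like; the -Prefix alternative avoids the name clash without clobbering the local Exchange cmdlets (the account name is an example):

# Connect to Exchange Online from a machine that is not the SBS box
$cloudCred = Get-Credential admin@yourbusiness.onmicrosoft.com
$session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $cloudCred -Authentication Basic -AllowRedirection

# Either overwrite same-named local cmdlets ...
Import-PSSession $session -AllowClobber

# ... or keep the two sets distinct, so Get-CloudMailbox runs against Office 365
# Import-PSSession $session -Prefix Cloud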

Third, the script still does not work, since, in my case at least, the Migration Batch did not populate the onmicrosoft.com email address for imported users. I fixed this with a handy script.
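
The handy script is behind the link, but the gist of that kind of fix is to add the missing onmicrosoft.com address as a secondary proxy address on each Office 365 mailbox. A rough sketch, with an example tenant name, run in the Exchange Online session:

# Add user@yourbusiness.onmicrosoft.com as a secondary address wherever it is missing
Get-Mailbox -ResultSize Unlimited | ForEach-Object {
    $routing = $_.Alias + "@yourbusiness.onmicrosoft.com"
    if ($_.EmailAddresses -notcontains "smtp:$routing") {
        Set-Mailbox -Identity $_.Identity -EmailAddresses @{add = $routing}
    }
}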

Note that the second script, Exchange2007MBtoMEU.ps1, must be run in the SBS server Exchange Shell, otherwise it will not work.

Bearing in mind all these hazards, you might think that the whole, not strictly necessary, step of converting to mail-enabled users is not worth it. That is perfectly reasonable.

Finishing the job

Bearing in mind the above, the next steps do not altogether make sense. In particular, step 11 says to make sure that:

“Office 365 mailboxes were synchronized at least once after mail began being sent directly to them. To do this, make sure that the value in the Last Synced Time box for the migration batch is more recent than when mail started being routed directly to Office 365 mailboxes.”

In fact, you will get errors here if you followed Step 7 to create mail-enabled users. Did anyone at Microsoft try to follow these steps?

Still, I have to say that the outcome in our case was excellent. Everything was copied correctly, and the Migration Batch tool even successfully replicated fiddly things like calendar permissions. The transition was smooth.

Note that you should not attempt to point an existing Outlook profile at the Office 365 Exchange. Instead, create a new profile. Otherwise I am not sure what happens; you probably get thousands of duplicate items.

One puzzle. I did not spot any duplicates in the synced mailboxes, but the item count increased by around 20% compared to the old mailboxes, as reported by PowerShell. Currently a mystery.
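
For reference, the comparison is easy to repeat with Get-MailboxStatistics on each side (example mailbox name):

# On the SBS server, in the Exchange Management Shell
Get-MailboxStatistics -Identity alice | Select-Object DisplayName, ItemCount, TotalItemSize

# In the Exchange Online session, for the migrated copy
Get-MailboxStatistics -Identity alice@yourbusiness.co.uk | Select-Object DisplayName, ItemCount, TotalItemSize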

Closing words

I am puzzled that Microsoft does not have any guidance specifically for Small Business Server migrations, given how common these are, as well as by the poor and inaccurate documentation as noted above.

There are perhaps two factors at play. One is that Microsoft expects businesses of any size to use partners for this kind of work, who specialise in knowing the pitfalls. Second, the company seems so focused on enterprises that the needs of small businesses are neglected. Note, for example, the strong push for businesses to use the Azure AD Connect tool even though this requires a multi-server setup. There is a special tool in Windows Server Essentials, but this does not apply to businesses using a Standard edition of Small Business Server.

Finally, note that there are third-party tools you can use for this kind of migration, in particular BitTitan’s MigrationWiz, which may well be easier though a small cost is involved.

Microsoft at Ignite: Building on Office 365, getting more like Google, Adobe mysteries and FPGA magic

I’m just back from Microsoft’s Ignite event in Atlanta, Georgia, where around 23,000 attendees mostly in IT admin roles assembled to learn about the company’s platform.

There are always many different aspects to this type of event. The keynotes (there were two) are for news and marketing hype, while there is lots of solid technical content in the sessions, of which of course you can only attend a small fraction. There was also an impressive Expo at Ignite, well supported both by third parties and by Microsoft, though getting to it was a long walk and I fear some will never find it. If you go to one of these events, I recommend the Microsoft stands because there are normally some core team members hanging around each one and you can get excellent answers to questions as well as a chance to give them some feedback.

The high level story from Ignite is that the company is doing OK. The event was sold out and Corporate VP Brad Anderson assured me that many more tickets could have been sold, had the venue been bigger. The vibe was positive and it looks like Microsoft’s cloud transition is working, despite having to compete with Amazon on IaaS (Infrastructure as a service) and with Google on productivity and collaboration.

My theory here is that Microsoft’s cloud advantage is based on Office 365, of which the core product is hosted Exchange and the Office suite of applications licensed by subscription. The dominance of Exchange in business made the switch to Office 365 the obvious solution for many companies; as I noted in 2011, the reality is that many organisations are not ready to give up Word and Excel, Outlook and Active Directory. The move away from on-premises Exchange is also compelling, since running your own mail server is no fun, and at the small business end Microsoft has made it an expensive option following the demise of Small Business Server. Microsoft has also made Office 365 the best value option for businesses licensing desktop Office; in fact, I spoke to one attendee who is purchasing a large volume of Office 365 licenses purely for this reason, while still running Exchange on-premises. Office 365 lets users install Office on up to 5 PCs, Macs and mobile devices.

Office 365 is only the starting point of course. Once users are on Office 365 they are also on Azure Active Directory, which becomes a hugely useful single sign-on for cloud applications. Microsoft is now building a sophisticated security story around Azure AD. The company can also take advantage of the Office 365 customer base to sell related cloud services such as Dynamics CRM online. Integrating with Office 365 and/or Azure AD has also become a great opportunity for developers. If I had any kind of cloud-delivered business application, I would be working hard to get it into the Office Store and trying to win a place on the newly refreshed Office App Launcher.

Office 365 users have had to put up with a certain amount of pain, mainly around the interaction between SharePoint online/OneDrive for Business and their local PC. There are signs that this is improving, and a key announcement made at Ignite by Jeff Teper is that SharePoint (which includes Team Sites) will be supported by the new generation sync client, which I hope means goodbye to the ever-problematic Groove client and a bit less confusion over competing OneDrive icons in the notification area.

A quick shout-out too for SharePoint Groups, despite its confusing name (how many different kinds of groups are there in Office 365?). Groups are ad-hoc collections of users which you set up for a project, department or role. Groups then have an automatic email distribution list, shared inbox, calendar, file library, OneNote notebook (a kind of Wiki) and a planning tool. Nothing you could not set up before, but packaged in a way that is easy to grasp. I was told that usage is soaring which does not surprise me.

I do not mean to diminish the importance of Azure, the cloud platform. Despite a few embarrassing outages, Microsoft has evolved the features of the service rapidly as well as building the necessary global infrastructure to support it. At Ignite, there were several announcements including new, more powerful virtual machines, IPv6 support, general availability of Azure DNS, faster networking at up to an amazing 25 Gbps powered by FPGAs, and the public preview of a Web Application Firewall; the details are here.

My overall take on Azure? Microsoft has the physical infrastructure to compete with AWS, though Amazon’s service is amazing and reliable, and I suspect it can be cheaper, bearing in mind Amazon’s clever pricing options and lower prices for application services like database management, message queuing, and so on. If you want to run Windows Server and SQL Server in the cloud, Azure will likely be better value. Value is not everything though, and Microsoft has done a great job of making Azure accessible; with a developer hat on I love how easy it is to fire up VMs or deploy web applications via Visual Studio. Microsoft of course is busy building hooks to Azure into its products, so that if you have System Center on-premises, for example, you will be constantly pushed towards Azure services (though note that the company has also added support for other public clouds in places).
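
To illustrate the point about firing up VMs: with the current Az PowerShell module (which postdates this post), a basic VM really is a couple of lines, with sensible defaults filled in for you; the names and location here are examples:

# Sign in and create a Windows Server VM with default networking
Connect-AzAccount
New-AzVM -ResourceGroupName "demo-rg" -Name "demo-vm" -Location "uksouth" -Image "Win2016Datacenter" -Credential (Get-Credential)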

There are some distinctive features in Microsoft’s cloud platform, not least the forthcoming Azure Stack, private cloud as an appliance.

I put “getting more like Google” in my headline, why is that? A couple of reasons. One is that CEO Satya Nadella focused his keynote on artificial intelligence (AI), which he described as “the ability to reason over large amounts of data and convert that into intelligence,” and then, “How we infuse every application, Cortana, Office 365, Dynamics 365 with intelligence.” He went on to describe Cortana (that personal agent that gets a bit in the way in Windows 10) as “the third run time … it’s what helps mediate the human computer interaction.” Cortana, he added, “knows you deeply. It knows your context, your family, your work. It knows the world. It is unbounded. In other words, it’s about you, it’s not about any one device. It goes wherever you go.”

I have heard this kind of speech before, but from Google’s Eric Schmidt rather than from Microsoft. While on the consumer side Google is better at making this work, there is an opportunity in a business context for Microsoft based on Office 365 and perhaps the forthcoming LinkedIn acquisition; but clearly both companies are going down the track of mining data in order to deliver more helpful and customized experiences.

It is also noticeable that Office 365 is now delivering increasing numbers of features that cannot be replicated on-premises, or that may come to on-premises one day but Office 365 users get them first. Further, Microsoft is putting significant effort into improving the in-browser experience, rather than pushing users towards Windows applications as you might have expected a few years back. It is cloud customers who are now getting the best from Microsoft.

While Microsoft is getting more like Google, I do not mean to say that it is like Google. The business model is different, with Microsoft’s based on paid licenses versus Google’s primarily advertising model. Microsoft straddles cloud and on-premises whereas Google has something close to a pure cloud play – there is Android, but that drives advertising and cloud services rather than being a profit centre in itself. And so on.

There were a couple more notable events during Nadella’s keynote.

[Image: Distinguished Engineer Doug Burger and one of Microsoft’s custom FPGA boards.]

One was Distinguished Engineer Doug Burger’s demonstration of the power of FPGA boards which have been added to Azure servers, sitting between the servers and the network so they can operate in part independently from their hosts (see my short interview with Burger here).

During the keynote, he gave what he called a “visual demo” of the impact of these FPGA accelerators on Azure’s processing power. First we saw accelerated image recognition, then a translation example using Tolstoy’s War and Peace.

The FPGA-enabled server consumed less power but performed the translation 8 times faster. The best was to come though. What about translating the whole of English Wikipedia? “I’ll show you what would happen if we were to throw most of our existing global deployment at it,” said Burger.

“Less than a tenth of a second” was the answer. Looking at that screen showing 1 Exa-op felt like being present at the beginning of a computing revolution. As the Top500 supercomputing site observes, “the fact the Microsoft has essentially built the world’s first exascale computer is quite an achievement.” Exascale is a billion billion operations per second.

However, did we see Wikipedia translated, or just an animation? Bearing in mind first, that Burger spoke of “what would happen”, and second, that the screen says “Estimated time”, and third, that the design of Azure’s FPGA network (as I understand it) means that utilising it could impact other users of the service (since all network traffic to the hosts goes through these boards), it seems that we saw a projected result and not an actual result – which means we should be sceptical about whether this would actually work as advertised, though it remains amazing.

One more puzzle before I wrap up. Adobe CEO Shantanu Narayen appeared on stage with Nadella, in the morning keynote, to announce that Adobe will make Azure its “preferred cloud.” This appears to include moving Adobe’s core cloud services from Amazon Web Services, where they currently run, to Azure. Narayen:

“we’re thrilled and excited to be announcing that we are going to be delivering all of our clouds, the Adobe Document Cloud, the Marketing Cloud and the Creative Cloud, on Azure, and it’s going to be our preferred way of bringing all of this innovation to market.”

Narayen said that Adobe’s decision was based on Microsoft’s work in machine learning and intelligence. He also looked forward to integrating with Dynamics CRM for “one unified and integrated sales and marketing service.”

This seems to me interesting in all sorts of ways, not only as a coup for Microsoft’s cloud platform versus AWS, but also as a case study in migrating cloud services from one public cloud to another. But what exactly is Adobe doing? I received the following statement from an AWS spokesperson:

“We have a significant, long-term relationship and agreement with Adobe that hasn’t changed. Their customers will want to use AWS, and they’re committed to continuing to make that easy.”

It does seem strange to me that Adobe would want to move such a significant cloud deployment, that as far as I know works well. I am trying to find out more.

Microsoft pivot: Ignite is now its key conference

I have been covering Microsoft for quite a few years and it was always clear to me that the must-attend event, if you want to keep up with the company, was the Professional Developer Conference (PDC), and after that was scrapped, its successor developer event Build.

The reason for this is that at PDC or Build the company gives in-depth presentations on the latest features of its developer platform. Pivotal events that I recall include PDC 2003 where we learned about the “Three pillars of Longhorn”, PDC 2008 where Windows 7 was previewed, and Build 2011 where Windows 8 was unveiled.

Two of these three worked out badly for the company, and one fantastically well. The three pillars of Longhorn became the two pillars of Vista after a notorious “reset” of Windows development, while Windows 8 was so hated in the PC community that Microsoft retreated to the more familiar and desktop-oriented Windows 10 a few years later.

Windows 7 on the other hand was such a success that even today, more than a year after the release of Windows 10, many PCs ship with Windows 7 pre-loaded and 10 as an upgrade option. Well, maybe that is a sign of failure (of the later versions) rather than success; but however you choose to spin it, it has been hugely popular.

Perhaps I should also mention PDC 2000 where the .NET Framework was announced (strictly, it was announced at TechEd Europe the previous week, but I digress). That one worked out pretty well I guess, though not without internal conflict between the C++ folk and the .NET folk – played out in both the Longhorn story and the Windows 8 story.

The reason though why these events were so strategically revealing was that nothing at Microsoft mattered more than the direction of Windows. These events were about informing and attracting developers to the Windows platform.

Alongside its developer events, Microsoft has always held others aimed more at system administrators, events like TechEd (especially the USA variant), MEC (Microsoft Exchange Conference) and Microsoft Management Summit (last held in 2013). While always interesting, it seemed to me that these IT Admin events were less strategic than the developer events, because the Windows platform was the foundation of the company’s business and it was at the developer events that you saw this platform evolve.

In August 2013 Microsoft CEO Steve Ballmer announced his retirement; he was succeeded in early 2014 by server guy Satya Nadella, accelerating the company’s pivot away from Windows and towards Office 365 and Azure as its key platform. Microsoft’s cloud runs on Windows of course, even if many of the VMs on Azure end up running Linux, but the company is now keen to emphasize its support for any operating system – or to be more precise, Windows, Mac, iOS, Android, and that vague thing IoT – presumably on the basis that broad endpoint support makes its cloud offering more compelling.

I could write screeds about Windows 10 and its evolution, about which I have mixed feelings. Windows for sure remains critically important to Microsoft, and indeed to all of us who feel that it meets needs that its competition does not address. (The answer is not always “just use a Mac”, if only because of Apple’s addiction to premium pricing and high profit margins).

Nevertheless, it is the cloud and hybrid cloud offerings that come first in today’s Microsoft, and Windows Server rather than Windows 10 that is more strategically important.

That is why Microsoft Ignite, which starts on Monday 26th September 2016 and is aimed primarily at IT administrators, is now the key event. Here we will see the formal launch of Windows Server 2016 as well as Azure and Office 365 news; and I plan to pay close attention.

UK South or UK West? Microsoft opens new data centres for Azure and Office 365

Microsoft has opened “multiple data centre locations in the UK” to run Azure and Office 365 cloud services.

I went to the Azure portal to create a new VM, to see the new options. It looks like you have to use the new portal: the old portal did not show the new regions, but in the new one I can choose between UK South and UK West.

An Azure region is composed of multiple data centres so this looks like a substantial investment. According to this document, the new regions are located in Cardiff and London.

The new infrastructure supports Azure and Office 365 today, with Dynamics CRM Online promised for the “first half of 2017”, according to the announcement.

Early customers are the Ministry of Defence, South London and Maudsley NHS Foundation Trust, Aston Martin, Capita and Rosslyn Analytics.

The announcement will help Microsoft and its partners sell these services to UK businesses concerned about compliance issues; there may also be some latency benefit. That said, Microsoft is a US corporation and the US government has argued that it can access this data with only a US search warrant. Microsoft has resisted this and won an appeal in July 2016; however there could always be new legislation. There is no simple answer.

Amazon Web Services has also announced plans for UK data centres; in fact, AWS was the first to reveal plans, but Microsoft has been quicker with implementation.

Notes from the field: Office 365 pain following Windows 10 upgrade

I got involved in looking at a PC where a few Office 365 problems had arisen following an upgrade to Windows 10 (prompted by Microsoft supposedly ending its free upgrade offer).

In particular, SharePoint online was crashing Internet Explorer. Internet Explorer? Don’t Windows 10 users stick to Edge?

Unfortunately Edge is problematic with certain sites. It works OK with Office 365 but there are some issues. For example, open a SharePoint document library in IE and you get the very useful option to “Open with Explorer”, an Explorer UI for your cloud-hosted files.

Try this in Edge and you get an error message instead; note how the help information does not tell you how to fix the problem.

For reasons like this, the user still had a shortcut to SharePoint online in IE on the Windows 10 taskbar. Click it though, and IE would crash with its “Internet Explorer has stopped working” dialog.

Probably an add-on, I thought. This was proved right when I opened IE with add-ons disabled – try running:

"%ProgramFiles%\Internet Explorer\iexplore.exe" -extoff

– and found that SharePoint online worked fine. After some experimentation, I discovered that the SharePoint Export Database Launcher add-on was causing the problem. Once I disabled it, SharePoint worked normally.

This add-on is installed by Microsoft Office. It prompts a couple of thoughts.

I do not know if every Windows 10 PC is similarly afflicted, but problems like this do suggest a lack of quality control in some areas. It is also unfortunate that when you install Office 365 Professional Plus you do not get any options; you get everything. Including, in this case, a buggy add-on.

Second, I wish Microsoft would pause from its energetic feature work with Office 365 and sort out the core functionality of working with documents in SharePoint online. As someone pointed out to me on Twitter today, the situation with OneDrive sync clients remains a mess, and when it goes wrong it is not always easy to troubleshoot.

Incidentally, I cannot resist telling you how to fix another OneDrive for Business issue. Here’s the problem: you open a document library in a web browser (even works in Edge), hit Sync, and OneDrive for Business fires up. If this is the first document library to be synced you might be prompted to sign in. So you enter your email address, hit Next, and then enter your password and click Sign in. Sometimes though nothing happens and you can’t sign in. What’s the fix? Don’t click Sign-in, press Enter!

AWS Summit London 2016: no news but strong content, and a little bit of Echo

I attended day two (the developer day) of the Amazon Web Services Summit at the ExCel conference centre in London yesterday. A few quick observations.

It was a big event. I am not sure how many attended but heard “10,000” being muttered. I was there last year as well, and the growth was obvious. The exhibition has spilled out of its space to occupy part of an upper mezzanine floor as well. The main auditorium was packed.

Amazon does not normally announce much news at these events, and this one conformed to the pattern. It is a secretive company when it comes to future plans. The closest thing to news was when AWS UK and Ireland MD Gavin Jackson said that Amazon will go ahead with its UK region despite the referendum on leaving the EU.

CTO Dr Werner Vogels gave a keynote. It was mostly marketing, which disappointed me, since Vogels is a technical guy with lots he could have said about AWS technology; but hey, this was a free event, so what do you expect? That said, the latter part of the keynote was more interesting, when he talked about different models of cloud computing, and I will be writing this up for the Register shortly.

Otherwise this was a good example of a vendor technical conference, with plenty of how-to sessions that would be helpful to anyone getting started with AWS. The level of the sessions I attended was fairly high, even the ones described as “deep dive”, but you could always approach the speaker afterwards with your trickier issues. The event was just as good as some others for which you have to pay a fee.

The sessions I attended on DevOps, containers, microservices, and AWS Lambda (serverless computing) were all packed, with containers perhaps drawing the biggest crowd.

At the end of the day I went to a smaller session on programming for Amazon Echo, the home voice control device which you cannot get in the UK. The speaker refused to be drawn on when we might get it, but I suppose the fact that Amazon ran the session suggests that it will appear in the not too distant future. I found this session thought-provoking. It was all about how to register a keyword with Amazon so that when a user says “Alexa, what’s new with [mystuff]”, the mystuff service is invoked. Amazon’s service will send your service the keywords (defined by you) that it detects in the question or interaction, and you send back a response. The trigger word – called the Invocation Name – has to be registered with Amazon, and I imagine there could be big competition for valuable ones. It is all rather limited at the moment; you cannot create a commercial service, for example, not even for ordering pizzas. Check out the Alexa Skills Kit for more.
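
To give a flavour of the programming model: Amazon sends your endpoint a JSON request containing the detected intent and slot values, and you return a JSON response with the speech to play back. A rough sketch of that response shape, built in PowerShell purely for illustration (this is not a complete Alexa Skills Kit implementation):

# Build a minimal Alexa-style response document and serialise it to JSON
$response = @{
    version  = "1.0"
    response = @{
        outputSpeech     = @{ type = "PlainText"; text = "Here is what is new with mystuff." }
        shouldEndSession = $true
    }
}
$response | ConvertTo-Json -Depth 4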

Presuming commercial usage does come, there are some interesting issues around identity, authentication, and preventing unauthorised or inappropriate use. Echo does allow ordering from Amazon, and you can optionally set a voice PIN, but I would have thought a voice PIN is not much use if you want to stop children ordering stuff, for example, since they will hear it. If you watch your email, you will see the confirmation email from Amazon and can quickly cancel if there is a problem. The security here seems weak though; it would be better to have an approval text sent to a mobile, for example, so that there is some real control.

Overall, AWS is still on a roll, and I did not hear a single thing about security concerns or the risks of putting all your eggs in Amazon’s basket. I wonder if fears have gone from being overblown to under-recognised? In the end these considerations are not quantifiable, which makes the risks hard to assess.

I could not help but contrast this AWS event to one I attended on Microsoft Azure last month. AzureCraft benefited from the presence of corporate VP Scott Guthrie but it was a tiny event in comparison to Amazon’s effort. If Microsoft is serious about competing with AWS it needs to rethink its events and put them on directly, rather than working through user groups that have a narrow membership (AzureCraft was put on by the UK Azure User Group).

The case of the disappearing Azure AD application registration

Some time ago I wrote a simple web application which runs on Microsoft Azure and uses Azure Active Directory for authentication. The application is used constantly and has proved reliable; however yesterday it stopped working. A quick debug session showed that the problem was an Azure AD permissions error.

In order to use Azure AD, applications have to be registered in the Azure management portal. I use the old portal for this; I am not sure that the functionality exists in the new portal yet. There is a nice how-to here.

One of the elements in the registration is a key which has a maximum lifetime of 2 years.

My application was deployed about two years ago so I went to the portal to see if it had expired.

What I found surprised me. The application was not listed at all. It had disappeared.

Instead of simply obtaining a new key and updating my application config, I had to create a new application registration and update several keys in the config, which was an annoyance.

There is a wider point here, in the whole category of dealing with “things that expire”. Some time ago, Microsoft suffered an extended Azure outage because of an expired certificate. It is a shame that Microsoft insists on a maximum 2 year lifetime for this key but does not provide a check box for “alert me when this key is about to expire”; how difficult would that be?
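
In the absence of a built-in alert, the check is at least scriptable; a sketch using the AzureAD PowerShell module, listing keys that expire within the next 30 days:

# Find application keys (client secrets) expiring within 30 days
Connect-AzureAD
Get-AzureADApplication -All $true | ForEach-Object {
    $app = $_
    Get-AzureADApplicationPasswordCredential -ObjectId $app.ObjectId |
        Where-Object { $_.EndDate -lt (Get-Date).AddDays(30) } |
        ForEach-Object { "{0}: key expires {1}" -f $app.DisplayName, $_.EndDate }
}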

Problems like this also mean that things which “just work” may not continue to do so. Of course a well organised enterprise setup can deal with this type of problem, but imagine, for example, the case of a small business with an application running on Azure where the developers have gone out of business, perhaps, or are no longer available. In fact the only code I needed to change was in web.config, but I can imagine it could take some time to figure out what to do and what to change.

Outlook 2016 attachment mysteries and annoyances

Microsoft Outlook 2016 has a new feature, which the company highlighted when the product first appeared: it sends attachments as links by default, if they are stored in network-accessible locations. The idea is to prevent the proliferation of different versions when several respondents make changes and email them back. It also means that everyone has the latest version. Good stuff, right?

I am not sure. Of course Outlook is meant to give you the choice about whether to send as a link or as a copy, but we all know that busy people just click and expect it to work; they mostly will not think through which method is appropriate in a particular case, or in some cases even understand the difference. One of the implications of sending links is that the document received may not be the document that was sent. For example, consider this scenario:

1. Hmm, shall I send the minutes of our last meeting to this person at supplier X? Better check there is nothing sensitive in it. [Checks]. OK, send.

2. Colleague happens to look at minutes, thinks, why did we not minute our difficulties with supplier X? Adds section of sensitive information and proposal to switch to supplier Y.

3. Person at supplier X receives document …

OK, my scenario is somewhat contrived, but you can see the underlying issue.

There is also the question of whether the mechanism behind this feature is really robust. It is not in fact a simple feature. What is meant to happen is that Outlook detects whether your document can be sent as a link, and if it can, interacts with SharePoint to create a magic link with either view or edit permissions. In my experience, it is easy to end up sending an attachment that cannot in fact be accessed by the person at the other end.

I have an internal SharePoint and soon figured out that I had to prevent Outlook from sending documents as links. The URL I use for SharePoint internally is not accessible externally, which is perhaps a flaw in my setup, but not one that has ever caused problems before. In any case, I would prefer not to give out any magic links to documents in my SharePoint; it just seems an unnecessary security risk.

In the case of Office 365, note that external sharing may be switched off, in which case links will not work. External sharing may also be disabled for specific sites.
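
You can check and change both settings with the SharePoint Online Management Shell; a sketch with example URLs:

# Tenant-wide external sharing setting
Connect-SPOService -Url https://yourbusiness-admin.sharepoint.com
Get-SPOTenant | Select-Object SharingCapability

# Per-site setting; Disabled switches off external sharing for that site
Get-SPOSite -Identity https://yourbusiness.sharepoint.com/sites/TeamSite | Select-Object Url, SharingCapability
Set-SPOSite -Identity https://yourbusiness.sharepoint.com/sites/TeamSite -SharingCapability Disabled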

Maybe Outlook 2016 is smart enough to detect whether or not external sharing is enabled, but if so, this does seem to go wrong sometimes. I have seen cases where users send an attachment link, but the recipient cannot access the document. Rather, they click the link and get a “can’t be found in directory” error or similar.

Another issue is that Outlook 2016 does not always offer you the choice of link or attachment in the way it is meant to. What happens sometimes is that the attachment does not end up in the “Attached” header at the top of the email, but in the body. In this scenario, you actually end up with a small Word table (Outlook messages use the Word editor) that cannot be converted into a standard attachment.

Note the little icon, an embedded image, which includes a cloud to give you a clue that this is not really attached. It also seems to mess up text formatting; note that my typing is now Times New Roman rather than Calibri. Another Outlook mystery.

This problem only seems to happen if you select a file from Outlook 2016’s recently accessed document list, which appears when you click the new Attach File button.

So how do you prevent this behaviour? Given the difficulties it can cause, I thought Outlook might have an option to disable sending attachments as links, or at least to prevent it happening by default. I have not found such an option yet. One point to bear in mind is that in previous versions of Outlook it was not easy to send a document from SharePoint at all, unless you could access it from Windows Explorer. This means using WebDAV (“Open in Explorer”), or the still-problematic OneDrive for Business client. So the dropdown with recently accessed SharePoint and OneDrive documents is new and potentially welcome functionality.

Here are a couple of workarounds though. If you format an email as plain text, which you can set as the default if you choose, then you will not get the embedded link that cannot be changed. Instead, you will get the dialog with options to link or attach a copy.

What if you want Outlook 2016 to behave like Outlook 2013 and earlier? Well, the Attach File button with the dropdown is not customizable directly, but you can add an old-style Attach File button. To do this, start a new email, right-click the toolbar, and click Customize the Ribbon. Right-click the New Mail Message section on the right, and choose Add New Group. Then select the Attach File command on the left and the new group on the right, and click Add. I have called my new group Custom.

The effect is that you now have two Attach File commands, one of which behaves just like Outlook 2013.

My custom Attach File button does not have a drop-down list, and simply selects a file using an insert file dialog.

I appreciate that these are workarounds and not complete solutions.

Did Microsoft really think through this feature? Why the bugs? Why no easy way to disable it? I wish I knew.