Category Archives: cloud computing

Cloud storage sums: how does the cost compare to backing up to your own drives?

Google now offers Cloud Storage Nearline (CSN) at $0.01 per GB per month.

Let’s say you have 1TB of data to store. That will cost $10 per month. Getting the data there is free if you have unlimited broadband, but getting it all back out (in the event of a disaster) costs $0.12 per GB, i.e. $120.

A 1TB external drive is around £45 or $58 (quick prices from Amazon for USB 3.0 drives). CSN is not an alternative to local storage, but a backup; you will still want something like network-attached storage, preferably with RAID resilience, to actually use the data day to day. The 1TB external drive would be your additional, and preferably off-site, backup. For the $120 per annum that CSN will cost, you could buy two or three of these.
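The sums are easy to sketch. A quick back-of-envelope in Python, using the prices quoted above (and assuming 1TB = 1,000GB):

```python
# Prices as quoted above; 1TB treated as 1,000GB for simplicity.
NEARLINE_STORE = 0.01   # $ per GB per month
NEARLINE_EGRESS = 0.12  # $ per GB retrieved
DRIVE_PRICE = 58.0      # $ for a 1TB USB 3.0 external drive

def nearline_annual_cost(gb, restores=0):
    """Yearly storage cost, plus the cost of `restores` full retrievals."""
    return gb * NEARLINE_STORE * 12 + restores * gb * NEARLINE_EGRESS

print(f"${nearline_annual_cost(1000):.0f}")              # $120 per year, no restore
print(f"${nearline_annual_cost(1000, restores=1):.0f}")  # $240 with one full restore
print(f"${DRIVE_PRICE * 2:.0f}")                         # $116 one-off for two drives
```

The comparison flatters the drives, of course, since they also need someone to plug them in, rotate them and carry them off-site.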

The advantage of the CSN solution is that it is off-site without the hassle of managing off-site drives and probably more secure (cloud hack risks vs chances of leaving a backup drive in a bus or taxi, or having it nabbed from a car, say). Your 1TB drive could go clunk, whereas Google will manage resilience.

A cloud-based backup is also more amenable to automation than managing drives by hand, unless you have the luxury of a connection to some other office or datacentre.

Still, even at these low prices you are paying a premium versus a DIY solution. And let’s not forget performance: anyone still on ADSL or other asymmetric connections will struggle with large uploads (typically 1-2 Mb/s upstream), while USB 3.0 is pretty fast (typically up to 100 MB/s in practice, though the interface can theoretically go much faster). If you have the misfortune to have data that changes frequently – a difficult case is the VHDs (Virtual Hard Disks) that back virtual machines – then cloud backup becomes impractical.
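The scale of that performance gap is worth spelling out; a rough transfer-time estimate (assuming sustained rates as quoted, and decimal units, 1TB = 8 × 10¹² bits):

```python
def transfer_days(size_tb, mbps):
    """Days to move size_tb terabytes at a sustained rate of mbps megabits/sec."""
    bits = size_tb * 1e12 * 8
    return bits / (mbps * 1e6) / 86400

# ADSL upstream at 1.5 Mb/s: roughly two months to push 1TB up
print(round(transfer_days(1, 1.5), 1))

# USB 3.0 drive at ~100 MB/s (800 Mb/s): under three hours
print(round(transfer_days(1, 800) * 24, 1))
```

In other words, the initial seeding of a 1TB cloud backup over ADSL is a matter of weeks, not hours, which is why some services offer drive-shipping for the first load.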

Microsoft financials Jan-March 2015

Microsoft has released figures for its third quarter, ending March 31st 2015. Here is my simple summary of the figures showing the segment breakdown:

Quarter ending March 31st 2015 vs quarter ending March 31st 2014, $millions

Segment                          Revenue   Change   Gross margin   Change
Devices and Consumer Licensing     3476    -1121        3210        -807
Computing and Gaming Hardware      1800      -72         414        +156
Phone Hardware                     1397      N/A          -4         N/A
Devices and Consumer Other         2280     +456         566        +175
Commercial Licensing              10036     -299        9975        -157
Commercial Other                   2760     +858        1144        +669

The figures form a familiar pattern: Windows and shrink-wrap (non-subscription) Office is down, reflecting weak PC sales and the advent of free Windows at the low end, but subscription sales are up and cloud is booming. See the foot of this post for an explanation of Microsoft’s confusing segment breakdown.

Microsoft says that Surface Pro 3 is doing well (revenue of $713 million) and this is reflected in the Devices figures. Commercial cloud (Office 365, Azure and Dynamics) is up 106% year on year.

Cloud aside, it is impressive that server products reported a 12% year on year increase in revenue. This is the kind of business that you would expect to be hit by cloud migration, though I am not sure how Microsoft accounts for things like SQL Server licenses deployed on Azure.

Xbox One is disappointing, bearing in mind the success of the Xbox 360. Microsoft managed to lose out to Sony’s PlayStation 4 with its botched launch and market share will be hard to claw back.

Microsoft reports 8.6 million Lumias sold, the majority being low-end devices. Not too bad for a platform many dismiss, but still treading water and miles behind iOS and Android.

The company remains a huge money-making machine though, and Office 365 is doing well. A few years ago it looked as if cloud and mobile could destroy Microsoft, but so far that is not the case at all, though its business is changing.

Microsoft’s segments summarised

Devices and Consumer Licensing: non-volume and non-subscription licensing of Windows, Office, Windows Phone, and “related patent licensing; and certain other patent licensing revenue” – all those Android royalties?

Computing and Gaming Hardware: the Xbox One and 360, Xbox Live subscriptions, Surface, and Microsoft PC accessories.

Devices and Consumer Other: Resale, including Windows Store, Xbox Live transactions (other than subscriptions), Windows Phone Marketplace; search advertising; display advertising; Office 365 Home Premium subscriptions; Microsoft Studios (games), retail stores.

Commercial Licensing: server products, including Windows Server, Microsoft SQL Server, Visual Studio, System Center, and Windows Embedded; volume licensing of Windows, Office, Exchange, SharePoint, and Lync; Microsoft Dynamics business solutions, excluding Dynamics CRM Online; Skype.

Commercial Other: Enterprise Services, including support and consulting; Office 365 (excluding Office 365 Home Premium), other Microsoft Office online offerings, and Dynamics CRM Online; Windows Azure.

Reflections on BoxDEV: keeping ahead of SharePoint, Changing Microsoft, and Eric Schmidt on Surveillance

Earlier this week I attended BoxDEV in San Francisco, along with around 1500 developers and some illustrious guests: Eric Schmidt from Google and Marc Benioff from Salesforce.

Schmidt was interviewed by Box CEO Aaron Levie. “Randomly watching and surveilling what’s going over the internet and invading the privacy of American citizens is not OK,” said Schmidt; but he was not talking about Google, rather about the NSA. “Encryption is the solution,” he said. It was all rather bizarre, as the king of data gatherers promised to protect our privacy; but he also made some fun comments about how most enterprise IT spend (90-95%) goes on legacy systems that will be replaced by cloud and mobile. A future in which two or three companies run all the world’s IT? Some more quotes on the Reg here.


What of Box though? Most people (individuals that is) probably think of Box as a cloud storage competitor and an alternative to DropBox, OneDrive or Google Drive. So it is; but the company sees itself as an enterprise collaboration platform rather than a commodity storage provider. Levie expressed the company’s intended differentiation neatly in the closing Q&A:

We build enterprise software with consumer grade experiences. There is not a lot of competition.

Box is working hard at ticking compliance boxes (sorry) and providing a service with which enterprises are comfortable.

This wasn’t a big event for announcements, but the company did present Box Developer Edition – a confusing name in my opinion, since it sounds like a Box account for test and development, which it is not. Rather, it is a new type of Box application in which you can provision your own users. The system creates shadow Box users under the covers, but it gives the illusion of a fully custom Box platform.

There are also new mobile SDKs. Box has split its SDK into modules, covering Content (the core), Browse, Share and Preview. The latter three include UI components as well as a non-visual wrapper for what is ultimately a REST API into its system. The Content and Browse SDKs support Windows Phone as well as iOS and Android, but Share and Preview are iOS/Android only. Microsoft is a Box partner though (and a sponsor of the event); it would not surprise me to see a Universal App SDK from Box in due course.

The Box View API is particularly interesting since it lets you render the content of numerous file formats as HTML. Getting this sort of stuff to work correctly is a challenge and it does add a lot of value to content-oriented applications. There are limitations. For example, if you have a PowerPoint document with an embedded video, the video will not render. There is some impressive technology here though, and Box is focusing on further improving it with acquisitions like that of Verold last week, which brings interactive 3D viewing to the platform.

I had a chat with Senior VP of Engineering Sam Schillace – he was one of the founders of Writely, a web-based word processor which was acquired by Google, and ported from C# to become the basis for an important part of Google Docs. Schillace is intimately familiar with the challenges of working with Microsoft Office formats and I’ve written up some of his remarks for the Register in a piece which will appear shortly.

I was also interested to note how many of the features of Box are also in SharePoint and Office 365. Again, I’ve covered this for the Register, but will say that it is just as well for Box that parts of SharePoint remain problematic, particularly the desktop sync aspect. The complexity of SharePoint is another issue. Box does less than SharePoint in many respects (there is no equivalent to the Office Web Apps, for example, which let you edit in the browser), but if Box does less, more reliably and with a better user experience, it can still succeed. On the other hand, if Microsoft manages to get SharePoint working really sweetly, particularly in its Office 365 guise, it will be tough for Box to compete – especially as Microsoft builds features like Office Delve, which does intelligent search in SharePoint Online and hooks into Office 365/Azure Active Directory groups and “social signals” from other parts of Office 365 such as Yammer.

This is a young, smart company though and capable of keeping ahead if it remains nimble.

Finally I must mention the closing Q&A with Levie. There were plenty of daft questions, one of which was “Did Microsoft sponsor BoxDEV so that you couldn’t make fun of them this year?”

“Satya Nadella isn’t as funny as Steve Ballmer,” said Levie, but he added, “we have changed how we talk about Microsoft because Microsoft has changed as a company.”

A thought-provoking remark on the eve of Build, which takes place next week here in San Francisco.


Notes from the field: when Outlook 2010 cannot connect to Office 365

If you set up a PC to connect to Office 365, you may encounter a problem where instead of connecting, Outlook repeatedly prompts for a password – even when you have entered all the details correctly.

I hit this issue when configuring Outlook 2010 on a new PC. It was not easy to find the solution, as most technical help documents suggest that this is either a problem with the autodiscover records in DNS (not so in this case), or that you can fix it with manual configuration of the connection properties (also not so in this case).

Note that if you are using Office 2010, you should install the desktop setup software from Office 365 before trying to configure Outlook. In this case, however, that still did not work.

The clue for me was when I noticed that Outlook 2010 was missing a setting in network security for Anonymous Authentication.


In order to fix this, I installed Office 2010 Service Pack 2, after which Outlook connected successfully. The catch is that if you set up a new PC from an Office 2010 DVD, it takes a while before everything is up to date.

I heard of another business that had this problem and decided to upgrade their Office 365 subscription to include the latest version of Office, rather than figuring out how to fix it. Now that plans including desktop Office are reasonably priced, this strikes me as a sensible option.

Microsoft publishes new OneDrive API with SDK, sample apps

Microsoft has announced a new OneDrive API for programmatic access to its cloud storage service. It is a REST API which Microsoft Program Manager Ryan Gregg says the company is also using internally for OneDrive apps. The new API replaces the previous Live SDK, though the Live SDK will continue to be supported. One advantage of the new API is that you can retrieve changes to files and folders in order to keep an offline copy in sync, or to upload changes made offline.

Unfortunately this does not extend to downloading only the changed part of a file (as far as I can tell); you still have to delete and replace the entire file. Imagine you had a music file in which only the metadata had changed: with the OneDrive API, you still have to upload or download the entire file, rather than simply applying the difference. However, you can upload files in segments in order to handle large files, up to 10GB.
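Segmented upload is essentially an exercise in slicing a file into byte ranges. The post does not document the session mechanics, so this sketch keeps to the chunking arithmetic; the header format in the comment is how such APIs conventionally work, and should be treated as an assumption:

```python
def chunk_ranges(total_size, chunk_size=10 * 1024 * 1024):
    """Yield (start, end_inclusive) byte ranges for a segmented upload."""
    for start in range(0, total_size, chunk_size):
        end = min(start + chunk_size, total_size) - 1
        yield start, end

# Each range would typically go into a Content-Range header, e.g.
# "bytes 0-10485759/52428800", sent against an upload-session URL
# returned by the service (session details assumed, not from the post).
ranges = list(chunk_ranges(52_428_800))  # a 50MB file
print(len(ranges))  # 5 chunks of 10MB
```

The same ranges work in reverse for resuming: after a failure you re-query the session for the ranges already received and carry on from there.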

I have worked with file upload and download using the Azure Blob Storage service so I was interested to see what is now on offer for OneDrive. I went along to the OneDrive API site on GitHub and downloaded the Windows/C# API explorer, which is a Windows Forms application (why not WPF?). This uses a OneDrive SDK library which has been coded as a portable class library, for use in desktop, Windows 8, Windows Phone 8.1 and Windows Phone Silverlight 8.


I have to say this is not the kind of sample I like. I prefer short snippets of code that demonstrate things like: here is how you authenticate, here is how you iterate through all the files in a folder, here is how you download a file, here is how you upload a file, and so on. All these features are there in this app, but finding them means weaving your way through all the UI code and async calls to piece together how it actually works. On top of that, despite all those async calls, there are some performance issues which seem to be related to the smart tiles which display a preview image, where possible, from each file and folder. I found the UI becoming unresponsive at times, for example when retrieving my large SkyDrive camera roll.
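To illustrate, here is the kind of snippet I mean: folder enumeration with paging reduced to its essentials. The paging field follows the OData convention the API uses, but the transport here is faked with a dictionary, so treat the details as illustrative rather than as the official SDK surface:

```python
def list_children(fetch, url):
    """Yield all items in a folder, following @odata.nextLink paging."""
    while url:
        page = fetch(url)          # fetch would be an authenticated HTTP GET
        yield from page.get("value", [])
        url = page.get("@odata.nextLink")  # None when there are no more pages

# Demo with a fake two-page transport standing in for the service.
pages = {
    "/drive/root/children": {
        "value": [{"name": "a.txt"}, {"name": "b.txt"}],
        "@odata.nextLink": "/drive/root/children?page=2",
    },
    "/drive/root/children?page=2": {"value": [{"name": "c.txt"}]},
}
names = [item["name"] for item in list_children(pages.get, "/drive/root/children")]
print(names)  # ['a.txt', 'b.txt', 'c.txt']
```

Separating the paging logic from the transport like this is also what makes such code testable without a live account, which is exactly what the sample app makes difficult.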

Gregg makes no reference in his post to OneDrive for Business, but my assumption is that the new API only applies to consumer OneDrive. Microsoft has said though that it intends to unify its two OneDrive services so maybe a future version will be able to target both.

At a quick glance the API looks different to the Azure Blob Storage API. They are different services but with some overlap in terms of features and I wonder if Microsoft has ever got all its cloud storage teams together to work out a common approach to their respective APIs.

I do not intend to be negative. OneDrive is an impressive and mostly free service and the API is important for lots of reasons. If you find the OneDrive integration in the current Windows 10 preview too limited (as I do), at least you now have the ability to code your own alternative.

Microsoft risks enterprise credibility by pushing out insecure mobile Outlook

One thing about Microsoft: it may not be the greatest for usability or convenience, but it does understand enterprise requirements around compliance and protecting corporate data.

At least, I thought it did.

That confidence has been undermined by the release yesterday of new “Outlook” mobile apps for iOS and Android.

I read the cheery blog posts from Office PM Julia White and from new Outlook GM Javier Soltero. “Now, with Outlook, you really can manage your work and personal email on your phone and tablet – as efficiently as you do on your computer,” says White.

There is a snag though. The new Outlook apps are rebadged Acompli apps, Acompli being a company acquired by Microsoft in early December 2014. Acompli, when it thought about how to create user-friendly email apps that connected to multiple accounts, came up with a solution which, as I understand it, looks like this:

  1. User gives us credentials for accessing email account
  2. We store those credentials in our cloud servers – except they are not really our servers, they are virtual machines on Amazon Web Services (AWS)
  3. Our server app grabs your email and we push it down to the app

A reasonable approach? Well, it simplifies the mobile app and means that the server component does all the hard work of dealing with multiple accounts and mail formats; and of course everything is described as “secure”.

However, there are several issues with this from a security and compliance perspective:

  1. From the perspective of the email provider, the app accessing the email is on the server, not on the device, and the server app may push the emails to multiple devices. That means no per-device access control.
  2. Storing credentials anywhere in a third-party cloud is a big deal. In the case of Exchange, they are Active Directory credentials, which means that if they were compromised, the hacker would potentially get access not only to email, but to anything for which the user has permission on that Active Directory domain.
  3. If an organisation has a policy of running servers on its own premises, it is unlikely to want credentials and email cached on the AWS cloud.

The best source of information is this post, A Deeper look at Outlook on iOS and Android, and specifically its comments. Microsoft’s Jon Orton confirms the architecture described above, which is also described in the Acompli privacy policy:

Our service retrieves your incoming and outgoing email messages and securely pushes them to the app on your device. Similarly, the service retrieves the calendar data and address book contacts associated with your email account and securely pushes those to the app on your device. Those messages, calendar events, and contacts, along with their associated metadata, may be temporarily stored and indexed securely both in our servers and locally on the app on your device. If your emails have attachments and you request to open them in our app, the service retrieves them from the mail server, securely stores them temporarily on our servers, and delivers them to the app … If you decide to sign up to use the service, you will need to create an account. That requires that you provide the email address(es) that you want to access with our service. Some email accounts (ones that use Microsoft Exchange, for example) also require that you provide your email login credentials, including your username, password, server URL, and server domain. Other accounts (Google Gmail accounts, for example) use the OAuth authorization mechanism which does not require us to access or store your password.


The only solution offered by Microsoft is to block the new apps using Exchange ActiveSync policy rules.

The new apps do not even respect Exchange ActiveSync policies – presumably hard to enforce given the architecture described above – though Microsoft’s AllenFilush says:

Outlook is wired up to work with Active Sync policies, but it currently only supports Remote Wipe (a selective wipe of the corporate data, not a device wipe). We will be adding full support for EAS policies like PIN lock soon.

However a user remarks:

Also, i have set up a test account, and performed a remote wipe, and nothing happened. I also removed the mobile device partnership later and still able to send and receive emails.

The inability to enforce a PIN lock means that if a device is stolen, the recipient might be able simply to turn on the device and read the corporate email.

The disappointment here is that Microsoft has held itself to a higher standard for security and compliance than its competitors, more perhaps than some realise, with features like BitLocker encryption built into Surface and Windows Phone devices.

Now the company seems willing to throw that reputation away for the sake of getting a consumer-friendly mobile app out of the door quickly. Worse still, it has been left to the community to identify and publicise the problems, leaving admins now racing to put the necessary blocks in place. If Microsoft was determined to do this, it should at least have forewarned administrators so that corporate data could be protected.

Reserved IPs and other Microsoft Azure annoyances

I have been doing a little work with Microsoft’s Azure platform recently. A common requirement is an internet-accessible VM with a custom domain, for which the best solution is to create an A record in your DNS pointing to the IP number of the VM. In order to do this reliably, you need to reserve an IP number for the VM; otherwise Azure may assign a different IP number if you shut it down and later restart it. If you keep it running you can keep the IP number, but this also means you have to pay for the VM continuously.

Azure now offers reserved IP numbers. Useful; but note that you can only link a VM with a reserved IP number when it is created, and to do this you have to create the VM with PowerShell.

What if you want to assign a reserved IP number to an existing VM? One suggestion is that you can capture an image from the VM, and then create a new VM from the image, complete with reserved IP. I went partially down this route but came unstuck because Azure for some reason captured the image into a different region (West Europe) from the one where the VM used to be (North Europe). When I ran the magic PowerShell script, it complained that the image was in the wrong region. I then found a post explaining how to move images between regions, which I did; but the metadata of the moved image was not quite the same, and creating a new VM from it did not work. At this point I realised that it would be easier to recreate the VM from scratch.

Note that when reserved IP numbers were announced in May 2014, program manager Mahesh Thiagarajan said:

The platform doesn’t support reserving the IP address of the existing Cloud Services or Virtual machines. We expect to announce support for this in the near future.

You can debate what is meant by “near future” and whether Microsoft has already failed to meet this expectation.

There is another wrinkle here that I am not clear about. Some Azure VMs have special pricing, such as those with SQL Server pre-installed. The special pricing is substantial, often forming the largest part of the price, since it includes licensing fees. What happens to the special pricing if you fiddle with cloning VMs, creating new VMs with existing VHDs, moving VMs between regions, or the like? If the special pricing is somehow lost, how do you restore it so SQL Server (for example) is still properly licensed? I imagine this would mean a call to support. I have not seen any documentation relating to this in posts like this about moving a virtual machine into a virtual network.

And there is another thing: if you want your VM to be in a virtual network, you have to specify that when you create it as well; it is a similar problem.

While I am in complaining mode, here is another. Creating a VM with PowerShell is easy enough, but you do need to know the name of the image you are using, which is not shown in the friendly portal GUI.


In order to get the image names, I ran a PowerShell script that exports the available images to a file. I was surprised how many there are: the resulting output runs to around 13,500 lines, and finding what you want is tedious.
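With an export that size, a quick case-insensitive filter goes a long way. A sketch (the sample image names below are hypothetical, made up to resemble the export format, not taken from the real catalogue):

```python
def find_images(lines, term):
    """Return lines from an exported image list containing term, ignoring case."""
    return [line for line in lines if term.lower() in line.lower()]

# Hypothetical sample entries standing in for the 13,500-line export.
sample = [
    "a699494373c04fc0__Win2012-Datacenter",
    "fb83b3509582419d__SQL-Server-2014-Standard",
    "b39f27a8b8c64d52__Ubuntu-14_04-LTS",
]
print(find_images(sample, "sql"))  # the SQL Server entry only
```

Piping the export through a filter like this (or plain findstr/grep) turns a tedious search into a one-liner.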

Azure is mostly very good in my experience, but I would like to see these annoyances fixed. I would be interested to hear of other things that make the cloud admin or developer’s life harder than it should be.

So that was 2014: Samsung stumbles, all change for Microsoft, Sony hack, more cloud, more mobile

What happened in 2014? One thing I did not predict is that Samsung lost its momentum, as Gartner’s figures for global smartphone sales by vendor for the third quarter of 2014 show.


Samsung is still huge, of course. But in 2013, Samsung seemed to be in such control of its premium brand that it could shape Android as it wished, rather than being merely an OEM for Google’s operating system. In the enterprise, Samsung KNOX held promise as a way to bring security and manageability to Android, but only in Samsung’s flavour. Today, that seems less likely. Market share is declining, and much of KNOX has been rolled into Android Lollipop. What is going wrong? The difficulty for Samsung is how to differentiate its products sufficiently, to avoid bleeding market share to keenly priced competition from vendors such as Xiaomi and Huawei. This is difficult if you do not control the operating system.

What of the overall mobile OS wars? 2014 brought few surprises: the Apple/Android duopoly continued, BlackBerry further diminished its share, and Windows Phone struggled on, though it was not looking good for Microsoft’s OS as 2014 closed; the Nokia acquisition may have been fumbled.

All change at Microsoft

That brings me to Microsoft, a company I watch closely. 2014 saw Satya Nadella appointed as CEO and several strategic changes, though the extent to which Nadella introduced those changes is uncertain. What changes?

Office is going truly cross-platform, with first-class support for iOS and Android. I covered this recently on the Register; the summary is that there will be mobile versions of Office for iOS, Android and Windows (this last a Store app) with similar features, and that more and more of the functionality of desktop Office will turn up in the mobile versions. I learned from my interview with Technical Product Manager Kaberi Chowdhury that ODF (Open Document) support is planned, as is some level of programmability.

The plans for Office are a clue to the company’s wider strategy, which is focused on cloud and server. Key products include Office 365, Windows Azure, Active Directory (and Azure Active Directory), SQL Server, SharePoint, and System Center as a management tool for hybrid cloud.

The Windows client strategy is to bring back users who disliked Windows 8 with a renewed focus on the desktop in the forthcoming Windows 10, while retaining the Store app model for apps that are secure, touch-friendly, and easily deployed. It is still not clear what Windows 10 phones and tablets will look like, but we can expect convergence; no more Windows RT, but perhaps tablets running Windows Phone OS that are in effect the next generation of Windows RT without a desktop personality.

The company will also hedge its bets with full app support for Office and its cloud services on iOS and Android, and in doing so will make its Windows mobile offerings less compelling.

Microsoft’s developer tools are changing in line with this strategy. The next generation of .NET is open source and cross-platform on the server side, for Windows, Mac and Linux. Xamarin plugs the gap for .NET on iOS and Android, while Microsoft is also adding native support (not .NET based) for cross-platform mobile in the next Visual Studio.

These are big changes to the developer stack, and Microsoft is forking .NET between the continuing Windows-only .NET Framework and the new cross-platform .NET Core. Developers have many questions about this; see this interview on the Register for what I could glean about the current plans. Watch out for the Build conference at the end of April, when the company will attempt to put it all together into a coherent whole for developers targeting either Windows 10, or cloud apps, or cloud services with cross-platform mobile clients.

This entire strategy is a logical progression from the company’s failure in mobile. Can it now succeed with client apps running on platforms controlled by its competitors? Alternatively, is there hope that Windows 10 can keep businesses hooked on Windows clients? Maybe 2015 will bring some answers, though with Windows 10 not expected until towards the end of the year there will be a long wait while iOS, Android and even Chrome OS (the operating system of Chromebook) continue to build.

A side effect is that C# now has a better chance of building a cross-platform user base, rather than being a Windows language. This has already happened in game development, thanks to the use of Mono and C# in the popular Unity game engine. Could it also happen with ASP.NET, deployed to Linux servers, now that this will be officially supported? Or is there little room for it alongside Java, PHP, Ruby, Node.js and the rest? 

The puzzle with Microsoft is that there is still too much mediocrity and complacency that damages the company’s offerings. How can it expect to succeed in the crowded wearables market with a band that is uncomfortable to wear? There is still an attitude in some parts of the company that the world will be happy to put up with problems that might be fixed in a future version after some long interval. Then again, the Azure team is doing great things and Windows Server continues to impress. Win or lose, there will be plenty of Microsoft news this year.

A theme for 2015: cloud optimization

Late last year I attended Amazon’s re:Invent conference in Las Vegas; I wrote this up here. The key announcement for me was Amazon Aurora, a MySQL clone, not so much because of its merits as a cloud database server, but more because it represents a new breed of applications that are designed for the cloud. If you design database storage with the knowledge that it will only ever run on a huge cloud-scale infrastructure, you can make optimizations that cannot be replicated on smaller systems. I tried to summarize what this means in another Register piece here. The fact that this type of technology can be rented by any of us at commodity prices increases the advantage of public cloud, despite reservations that many still have concerning security and control. It also poses a challenge for companies like Oracle and Microsoft whose technology is designed for on-premises as well as cloud deployment; they cannot achieve the same advantage unless they fork their products, creating cloud variants that use different architecture.

The Sony hack

The cyber invasion of Sony Pictures in late November was not just another hack; it was a comprehensive takedown in which (as far as I can tell) the company’s entire IT estate was compromised and significantly damaged.

According to this report:

Mountains of documents had been stolen, internal data centers had been wiped clean, and 75 percent of the servers had been destroyed.

Most IT admins worry about disaster recovery (what to do after catastrophic system failure such as a fire in your data center) as well as about security (what to do if hackers gain access to sensitive information). In this case, both seemed to happen simultaneously. Further, as producing movies is in effect a digital business, the business suffered loss of some of its actual products, such as the unreleased “Annie”.

The incident is fascinating in itself, especially as we do not know the identity of the hackers or their purpose, but what interests me more are the implications.

Specifically, how many companies are equally at risk? It seems clear that Sony’s security was towards the weak end of the scale, but there is plenty of weak security out there, especially but not exclusively in smaller businesses.

With the outcome of the Sony hack so spectacular, it is likely that there will be similar efforts in 2015, as well as many businesses looking nervously at their own practices and wondering what they can do to protect themselves.

Cloud may be part of the answer, though even if the cloud provider gets security right, that is no guarantee that its customers do the same.

Looking back on looking back

Here is what I wrote a year or so ago: Reflecting on 2013 – the year of not the PC, no privacy, and the Internet of Things. Most of it still applies. I have not achieved any of the three goals I set for myself though. Maybe this year…

SSD storage has come to Azure VMs, along with faster Azure SQL

Microsoft has introduced SSD storage for Azure VMs. This is a catch-up with Amazon, which has been offering this since at least June 2014. It is an important feature though, and now in preview. The SSDs are part of the Azure storage service but can only be used for disks attached to VMs, not for general-purpose blob storage. There are three virtual disk types available:

             P10        P20        P30
Disk size    128GB      512GB      1TB
IOPS         500        2300       5000
Throughput   100 MB/s   150 MB/s   200 MB/s

Price is $6.90 per 100GB per month, which if I am reading this right is less than Amazon’s $0.10 per GB per month ($10 per 100GB) as shown here.
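Checking that reading of the price lists (figures as quoted above, per month):

```python
# Azure premium SSD storage vs the quoted Amazon SSD price, per 100GB per month.
azure_per_100gb = 6.90
amazon_per_gb = 0.10
amazon_per_100gb = amazon_per_gb * 100

savings = amazon_per_100gb - azure_per_100gb
print(f"Azure is ${savings:.2f} per 100GB per month cheaper")
```

A gap of just over $3 per 100GB per month, which adds up across a fleet of VM disks, though as ever the effective cost depends on provisioned versus used capacity.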

One obvious use case is for SQL Server running on a VM. This generally performs better than Microsoft’s Azure SQL database service. That said, Microsoft is also previewing an improved Azure SQL which supports most of the features of SQL Server 2014, including .NET stored procedures and in-memory columnstore queries. Microsoft’s Scott Guthrie says performance is better:

Our internal benchmark tests (using over 600 million rows of data) show query performance improvements of around 5x with today’s preview relative to our existing Premium Tier SQL Database offering and up to 100x performance improvements when using the new In-memory columnstore technology.

If you can make it work, Azure SQL makes better sense than running SQL Server in a VM, with all the hassles of server patching and of course Microsoft’s licensing fees; but the performance has to be there. Another factor which drives users to the VM option is that SQL Server Reporting Services is not available in Azure SQL.

Quick reflections on Amazon re:Invent, open source, and Amazon Web Services

Last week I was in Las Vegas for my first visit to Amazon’s annual developer conference re:Invent. There were several announcements, the biggest being a new relational database service called RDS Aurora – a drop-in replacement for MySQL but with 3x write performance and 5x read performance as well as resiliency benefits – and EC2 Container Service, for deploying and managing Docker app containers. There is also AWS Lambda, a service which runs code in response to events.
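By way of illustration, a Lambda function is just a handler that the service invokes with the triggering event; the sketch below assumes an S3 object-created notification as the event source, using the field names from the S3 notification format:

```python
import json

def handler(event, context):
    # Lambda passes the triggering event as a dict; for S3 notifications,
    # each record identifies the bucket and object key that changed.
    keys = [r["s3"]["object"]["key"]
            for r in event.get("Records", [])
            if "s3" in r]
    return {"statusCode": 200, "body": json.dumps({"processed": keys})}
```

There is no server to provision or manage between invocations; Amazon runs the code only when the event fires.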

You could read this news anywhere, but the advantage of being in Vegas was to immerse myself in the AWS culture and get to know the company better. Amazon is both distinctive and disruptive, and three things that its retail operation and its web services have in common are large scale, commodity pricing, and customer focus.

Customer focus? Every company I have ever spoken to says it is customer focused, so what is different? Well, part of the press training at Amazon seems to be that when you ask about its future plans, the invariable answer is “what customers demand.” No doubt if you could eavesdrop at an Amazon executive meeting you would find that this is not entirely true, that there are matters of strategy and profitability which come into play, but this is the story the company wants us to hear. It also chimes with that of the retail operation, where customer service is generally excellent; the company would rather risk giving a refund or replacement to an undeserving customer and annoy its suppliers than vice versa. In the context of AWS this means something a bit different, but it does seem to me part of the company culture. “If enough customers keep asking for something, it’s very likely that we will respond to that,” marketing executive Paul Duffy told me.

That said, I would not describe Amazon as an especially open company, which is one reason I was glad to attend re:Invent. I was intrigued for example that Aurora is a drop-in replacement for an open source product, and wondered if it actually uses any of the MySQL code, though it seems unlikely since MySQL’s GPL license would require Amazon to publish its own code if it used any MySQL code; that said, the InnoDB storage engine code at least used to be available under a dual license so it is possible. When I asked Duffy though he said:

We don’t … at that level, that’s why we say it is compatible with MySQL. If you run the MySQL compatibility tool that will all check out. We don’t disclose anything about the inner workings of the service.

This of course touches on the issue of whether Amazon takes more from the open source community than it gives back.

Senior VP of AWS Andy Jassy

Someone asked Senior VP of AWS Andy Jassy, “what is your strategy of contributing to the open source ecosystem”, to which he replied:

We contribute to the open source ecosystem for many years. Xen, MySQL space, Linux space, we’re very active contributors, and will continue to do so in future.

That was it, that was the whole answer. Aurora, despite Duffy’s reticence, seems to be a completely new implementation of the MySQL API and builds on its success and popularity; could Amazon do more to share some of its breakthroughs with the open source community from which MySQL came? I think that is arguable; but Amazon is hard to hate since it tends to price so competitively.

Is Amazon worried about competition from Microsoft, Google, IBM or other cloud providers? I heard this question asked on several occasions, and the answer was generally along the lines that AWS is too busy to think about it. Again this is perhaps not the whole story, but it is true that AWS is growing fast and dominates the market to the extent that, say, Azure’s growth does not keep it awake at night. That said, you cannot accuse Amazon of complacency since it is adding new services and features at a high rate; 449 so far in 2014 according to VP and Distinguished Engineer James Hamilton, who also mentioned 99% usage growth in EC2 year on year, over 1,000,000 active customers, and 132% data transfer growth in the S3 storage service.

Cloud thinking

Hamilton’s session on AWS Innovation at Scale was among the most compelling of those I attended. His theme was that cloud computing is not just a bunch of hosted servers and services, but a new model of computing that enables new and better ways to run applications that are fast, resilient and scalable. Aurora is actually an example of this. Amazon has separated the storage engine from the relational engine, he explained, so that only deltas (the bits that have changed) are passed down for storage. The data is replicated 6 times across three Amazon availability zones, making it exceptionally resilient. You could not implement Aurora on-premises; only a cloud provider with huge scale can do it, according to Hamilton.
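A toy model of that design (purely illustrative, not Amazon’s code): push only the changed bytes to each of the six replicas, and count the write as durable once a majority acknowledge it, so losing both copies in one availability zone does not block writes:

```python
def delta_write_durable(acks, replicas=6):
    # Six copies spread across three availability zones; the write counts
    # as durable once a majority of replicas (4 of 6) have acknowledged it.
    quorum = replicas // 2 + 1
    return sum(acks) >= quorum

# Losing both copies in one availability zone still leaves a quorum:
assert delta_write_durable([True, True, True, True, False, False])
```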

Distinguished Engineer James Hamilton

Hamilton was fascinating on the subject of networking gear – the cards, switches and routers that push bits across the network. Five years ago Amazon decided to build its own, partly because it considered the commercial products to be too expensive. Amazon developed its own custom network protocol stack. It worked out a lot cheaper, he said, since “even the support contract for networking gear was running into 10s of millions of dollars.” The company also found that reliability increased. Why was that? Hamilton quipped about how enterprise networking products evolve:

Enterprise customers give lots of complicated requirements to networking equipment producers who aggregate all these complicated requirements into 10s of billions of lines of code that can’t be maintained and that’s what gets delivered.

Amazon knew its own requirements and built for those alone. “Our gear is more reliable because we took on an easier problem,” he said.

AWS is also in a great position to analyse performance. It runs so much kit that it can see patterns of failure and where the bottlenecks lie. “We love metrics,” he said. There is an analogy with the way the popularity of Google search improves Google search; it is a virtuous circle that is hard for competitors to replicate.

Closing reflections

Like all vendor-specific conferences there was more marketing than I would have liked at re:Invent, but there is no doubting the excellence of the platform and its power to disrupt. There are aspects of public cloud that remain unsettling; things can go wrong and there will be nothing you can do but wait for them to be fixed. The benefits though are so great that it is worth the risk – though I would always advocate having some sort of plan B, whether off-cloud or a backup with another cloud provider, if that is feasible.